Deadline Date:
Tuesday, 4 March 2025
Requirement Title: Data Engineer
Location:
The Hague, NETHERLANDS
Full time on-site: Yes
Not to Exceed:
87 Euro
Total Scope of the request (hours):
836
Required Start Date:
13-APR-2025
Required Security Clearance:
NATO Secret
Specific Working Conditions:
Normal office environment
Duties and Role
- Designs and implements data pipelines and data stores to acquire and prepare data.
- Applies data engineering standards and tools to create and maintain data pipelines and extract, transform and load data.
- Carries out routine data quality checks and remediation.
Programming/software development
- Designs, codes, verifies, tests, documents, amends and refactors moderately complex programs/scripts.
- Applies agreed standards and tools to achieve a well-engineered result.
- Identifies issues related to software development activities.
- Proposes practical solutions to resolve issues.
- Collaborates in reviews of work with others as appropriate.
Systems integration and build
- Defines the software modules needed for an integration build and produces a build definition for each generation of the software.
Data management
- Assesses the integrity of data from multiple sources.
- Provides advice on the transformation of data from one format/medium to another.
- Enables the availability, integrity and searchability of information through the application of formal data and metadata structures and protection measures.
Data modelling and design
- Applies standard data modelling and design techniques based upon a detailed understanding of requirements.
- Establishes, modifies and maintains data structures and associated components.
- Communicates the details of data structures and associated components to those who use them.
Requirements
- NATO Secret clearance
- Ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management.
- At least two years' experience planning and maintaining data lakes and pipeline processes, both batch and (near) real-time.
- Experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures and integrated datasets using data integration technologies.
- Experience using data engineering tools to support data science, data analytics and data visualisation.
- Experience with state-of-the-art data engineering, analytics, data integration and pipeline tools/programming languages.
- Solid understanding of data security best practices, including data encryption, access controls, authentication/authorization mechanisms, and compliance requirements.
- Experience with monitoring and logging tools to ensure the reliability and performance of data pipelines and AI systems.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Knowledge of software development best practices, e.g. CI/CD pipelines, unit/functional testing, and following style guides.