ManTech seeks a motivated, career- and customer-oriented Data Engineer to join our team in Fayetteville, NC.
In this role, you will join a mission-focused team supporting military operators and mission partners, leading the design, development, and maintenance of robust data architectures and pipelines that enable advanced analytics and decision-making for critical operations. You will collaborate with cross-functional teams to ensure data integrity, scalability, and performance across enterprise-level systems.
Responsibilities Include But Are Not Limited To
- Designing and implementing scalable and resilient data pipelines for structured and unstructured data sources
- Developing ETL/ELT processes and integrating disparate data systems
- Optimizing data storage solutions using relational, NoSQL, and cloud-based technologies
- Building data models and data marts to support business intelligence and machine learning applications
- Collaborating with data scientists, analysts, and software engineers to support end-to-end data solutions
- Ensuring data quality, governance, and compliance with security policies
- Automating data validation, testing, and monitoring processes to maintain data health
- Identifying and implementing performance improvements for large-scale data workflows
- Establishing best practices for data engineering, data quality, and data governance to support the full lifecycle of C5ISR data
Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field; a High School diploma and 4 years of additional experience, or an Associate’s degree and 2 years of additional experience, may be substituted for the required Bachelor’s degree
- 9+ years of professional experience in data engineering or a related discipline
- Expertise with modern data engineering tools and frameworks (e.g., Apache Spark, Kafka, Airflow)
- Strong SQL skills and experience with Python or Scala for data manipulation
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP
- Knowledge of data warehousing solutions like Snowflake, Redshift, or BigQuery
- Experience with CI/CD practices and version control systems like Git
Preferred Qualifications
- Master’s degree in Data Science, Computer Science, or related field
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Exposure to DevSecOps practices in a mission-critical environment
- Certification in cloud technology (AWS Certified Data Analytics, GCP Professional Data Engineer, etc.)
Clearance Requirements
- Active Top Secret security clearance with SCI eligibility
Physical Requirements
- The person in this position must be able to remain in a stationary position 50% of the time and occasionally move about inside the office to access file cabinets and office machinery, or to communicate with co-workers, management, and customers via email, phone, and/or virtual communication, which may involve delivering presentations