Description
Leidos has a new and exciting opportunity for a Principal ETL Developer in our National Security Sector's (NSS) Cyber & Analytics Business Area (CABA). Our talented team is at the forefront of Security Engineering, Computer Network Operations (CNO), Mission Software, Analytical Methods and Modeling, Signals Intelligence (SIGINT), and Cryptographic Key Management. At Leidos, we offer competitive benefits, including Paid Time Off, 11 paid Holidays, 401K with a 6% company match and immediate vesting, Flexible Schedules, Discounted Stock Purchase Plans, Technical Upskilling, Education and Training Support, Parental Paid Leave, and much more. Join us and make a difference in National Security!
Job Description
We have an IMMEDIATE NEED for an ETL developer to play a pivotal role in shaping, leading, and implementing cutting-edge data flow solutions centered around Apache NiFi. The candidate will provide technical expertise and support in the design, development, implementation, and testing of customer tools and applications in support of extracting, transforming, and loading (ETL) data into an enterprise Data Lake. The candidate will be responsible for defining architectural best practices, optimizing performance in large-scale environments, and mentoring junior developers, ensuring the delivery of robust, scalable, and secure data flow solutions that meet critical customer needs. Working within a DevOps framework, the ETL Developer participates in and directs major project deliverables through all aspects of the software development lifecycle.
Primary Responsibilities:
Architecting complex NiFi data pipelines: Design and develop enterprise-level ETL architectures and implement NiFi data pipelines for large-scale data ingestion, transformation, and processing from diverse sources.
Performance optimization and tuning: Optimize NiFi data flows, including processor tuning, memory management, and load balancing, ensuring optimal performance for batch and real-time processing.
Advanced troubleshooting and problem resolution: Identify, diagnose, and resolve complex NiFi data flow issues, including performance bottlenecks, data discrepancies, and integration failures.
Integrating with big data and cloud technologies: Seamlessly integrate NiFi with various databases, big data ecosystems, and cloud platforms (e.g., AWS, OCI, Azure), demonstrating expertise in relevant services (e.g., Kafka, Elasticsearch, S3, SQS/SNS).
Defining best practices and standards: Establish best practices for NiFi development, deployment, security, and governance, ensuring adherence to enterprise-wide data management policies.
Documentation and knowledge sharing: Create and maintain comprehensive documentation for NiFi data flows, mappings, architectures, and standard operating procedures, ensuring knowledge transfer and promoting efficient team operations.
Collaboration and communication: Collaborate effectively with data architects, data engineers, application/service developers, and other stakeholders to translate business requirements into robust technical solutions and effectively communicate complex technical concepts to both technical and non-technical audiences.
Mentorship and team leadership: Mentor junior developers, provide technical guidance, conduct code reviews, and foster a collaborative learning environment.
Basic Qualifications:
In-depth experience designing, developing, and managing complex NiFi data flow solutions in large-scale enterprise environments.
In-depth knowledge of NiFi architecture, processors, and configurations, along with hands-on experience with NiFi Registry and clustering for high availability and scalability.
Proficiency in programming languages like Java and Python for custom NiFi processor development and scripting for automation.
Proficiency writing and optimizing complex queries, along with experience in managing relational and NoSQL databases (e.g., Postgres, Elasticsearch, DynamoDB).
Direct experience with real-time streaming and API integration (REST) for seamless data connectivity.
Direct experience with cloud platforms like AWS, Azure, or OCI and related data services.
Strong ability to analyze complex data challenges, identify root causes, and implement effective solutions.
Strong ability to collaborate effectively with cross-functional teams, articulate technical concepts clearly, and provide effective mentorship.
Bachelor’s degree with 12 or more years of prior relevant experience, or Master’s degree with 10 or more years of relevant experience. Additional years of experience may be accepted in lieu of a degree.
To be considered, candidates must have an active TS/SCI with polygraph security clearance.
Preferred Qualifications:
At Leidos, we don’t want someone who "fits the mold"—we want someone who melts it down and builds something better. This is a role for the restless, the over-caffeinated, the ones who ask, “what’s next?” before the dust settles on “what’s now.”
If you’re already scheming step 20 while everyone else is still debating step 2… good. You’ll fit right in.
Original Posting:
September 11, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range:
$126,100.00 - $227,950.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.