Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a daily collaborative, think-tank environment where you can share new ideas with your colleagues - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.
Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below.
Work you'll do/Responsibilities
- Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management.
- Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
- Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution.
- Generate design, development, and test plans, detailed functional specification documents, user interface designs, and process flow charts for execution of programming.
- Develop data pipelines and APIs using Python, SQL, and potentially Spark, along with AWS, Azure, or GCP services.
- Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
- Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or GCP cloud platforms.
- Move data from on-premises systems to the cloud and perform cloud data conversions (a brief sketch of this kind of work follows this list).
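To make the pipeline responsibilities above concrete, here is a minimal, illustrative sketch of a batch job that moves an on-premises extract into cloud object storage, written in PySpark. It is a sketch under stated assumptions, not a prescribed approach for this role; every path, bucket, job, and column name is a hypothetical placeholder.

```python
# Minimal illustrative batch pipeline in PySpark. All paths, bucket names,
# and columns are hypothetical placeholders, not specifics of this role.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run_batch_load():
    spark = (
        SparkSession.builder
        .appName("onprem-to-cloud-batch-load")  # hypothetical job name
        .getOrCreate()
    )

    # Extract: read a raw on-premises extract delivered as CSV files.
    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("/data/onprem/orders/")  # hypothetical source path
    )

    # Transform: standardize column names and stamp the load date.
    cleaned = (
        raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
           .withColumn("load_date", F.current_date())
    )

    # Load: write partitioned Parquet to cloud object storage
    # (S3 shown; ADLS or GCS would be analogous).
    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("load_date")
        .parquet("s3://example-datalake/bronze/orders/")  # hypothetical bucket
    )

    spark.stop()


if __name__ == "__main__":
    run_batch_load()
```

The same pattern extends to real-time pipelines by swapping the batch read for a streaming source, or to Azure/GCP by changing only the storage URIs and runtime.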
The Team: Artificial Intelligence & Data Engineering
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.
The Artificial Intelligence & Data Engineering team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
Artificial Intelligence & Data Engineering will work with our clients to:
- Implement large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements.
Qualifications
Required
- 3+ years of experience in data engineering with an emphasis on data analytics and reporting.
- 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others.
- 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one Database Platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, Databricks, etc.).
- 3+ years of experience designing and building data extraction, transformation, and loading processes by writing custom data pipelines (see the sketch following this list).
- 3+ years of experience with one or more of the following scripting languages and data technologies: Python, SQL, Kafka, and/or others.
- 3+ years of experience designing and building solutions using cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
- Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or a related IT discipline, or equivalent experience.
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
- Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve. This may include overnight travel.
- Must be able to obtain the required level of security clearance for this role.
- Must live within a commutable distance (approximately a 100-mile radius) of one of the following Delivery locations: Atlanta, GA; Charlotte, NC; Dallas, TX; Gilbert, AZ; Houston, TX; Lake Mary, FL; Mechanicsburg, PA; Philadelphia, PA, with the ability to commute to the assigned location for the day without the need for overnight accommodations.
- Expectation to co-locate in your designated Delivery location up to 30% of the time, based on business needs. This may include a maximum of 10% overnight client/project travel.
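As a hedged illustration of the custom extraction, transformation, and loading work the qualifications above describe (not a prescribed method for this role), the sketch below assumes a hypothetical PostgreSQL source and S3 target; the connection string, SQL, table, and bucket names are placeholders.

```python
# Minimal illustrative ETL step in plain Python. The connection string, SQL,
# table, and bucket below are hypothetical placeholders, not project specifics.
import io

import boto3
import pandas as pd
from sqlalchemy import create_engine


def run_etl():
    # Extract: pull yesterday's orders from a relational source with plain SQL.
    engine = create_engine("postgresql://user:password@db-host:5432/sales")  # hypothetical DSN
    df = pd.read_sql(
        """
        SELECT order_id, customer_id, order_total, order_date
        FROM orders
        WHERE order_date = CURRENT_DATE - INTERVAL '1 day'
        """,
        engine,
    )

    # Transform: aggregate to a simple reporting grain.
    daily = (
        df.groupby("customer_id", as_index=False)
          .agg(order_count=("order_id", "count"),
               total_spend=("order_total", "sum"))
    )

    # Load: write Parquet to object storage (S3 shown; other cloud stores are similar).
    buffer = io.BytesIO()
    daily.to_parquet(buffer, index=False)
    boto3.client("s3").put_object(
        Bucket="example-analytics-bucket",  # hypothetical bucket
        Key="reporting/daily_customer_spend/latest.parquet",
        Body=buffer.getvalue(),
    )


if __name__ == "__main__":
    run_etl()
```

In practice such a step would typically be parameterized and orchestrated (for example with Glue, Lambda, or a scheduler), but the extract-transform-load shape is the same.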
Preferred
- AWS, Azure and/or Google Cloud Platform Certification.
- Master's degree or higher.
- Expertise in one or more programming languages, preferably Scala, PySpark and/or Python.
- Experience working with either a MapReduce or an MPP system at any size or scale.
- Experience working with agile development methodologies such as Scrum, including sprint-based delivery.
Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html