This role is categorized as hybrid: the successful candidate is expected to report to the Warren, MI or Austin, TX office at least three times per week, or more frequently as dictated by the business.
What You’ll Do
- Communicate and maintain Master Data, Metadata, Data Management Repositories, Logical Data Models, and Data Standards
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build industrialized analytic datasets and delivery mechanisms that use the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with business partners on data-related technical issues and develop requirements to support their data infrastructure needs
- Create highly consistent and accurate analytic datasets suitable for business intelligence and data science team members
Your Skills & Abilities (Required Qualifications)
- At least 3 years of hands-on experience with big data tools: Hadoop, Spark, Kafka, etc.
- Mastery of databases, including advanced SQL and NoSQL databases such as Postgres and Cassandra
- Data Wrangling and Preparation: Alteryx, Trifacta, SAS, Datameer
- Stream-processing systems: Storm, Spark Streaming, etc.
- 7 or more years of experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Ability to tackle problems quickly and completely
- Ability to identify tasks which require automation and automate them
- A demonstrable understanding of networking/distributed computing environment concepts
- Ability to multi-task and stay organized in a dynamic work environment
What Can Give You a Competitive Advantage (Preferred Qualifications)
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- AWS cloud services: EC2, EMR, RDS, Redshift
This job may be eligible for relocation benefits.
Compensation:
- The expected base compensation for this role is $125,000 - $205,100. Actual base compensation within the identified range will vary based on factors relevant to the position.
- Bonus Potential: An incentive pay program offers payouts based on company performance, job level, and individual performance.
- Benefits: GM offers a variety of health and wellbeing benefit programs. Benefit options include medical, dental, vision, Health Savings Account, Flexible Spending Accounts, retirement savings plan, sickness and accident benefits, life insurance, paid vacation & holidays, tuition assistance programs, employee assistance program, GM vehicle discounts and more.
GM DOES NOT PROVIDE IMMIGRATION-RELATED SPONSORSHIP FOR THIS ROLE. DO NOT APPLY FOR THIS ROLE IF YOU WILL NEED GM IMMIGRATION SPONSORSHIP (e.g., H-1B, TN, STEM OPT, etc.) NOW OR IN THE FUTURE.
#LI-CC1