Westminster, CO 80234
Responsibilities:
- Data Integration: Collect, store, and aggregate data to support the creation of great data products for the business and our customers.
- Data Access: Build data APIs, access controls, data catalogs, and metadata structures to support enterprise-scale data discovery and sharing.
- Data Quality: Validate data and maintain healthy infrastructure and data flows.
- Thought Leadership: Contribute best practices and innovation leadership toward realizing Maxar’s data maturity goals. Continually learn and seek ways to improve our data flows.
- Team Engagement: Collaborate well in a team environment (we use agile). Be creative and cooperative in designing and building data pipelines.

Minimum Qualifications:
- Bachelor’s Degree in a technical field or equivalent work experience
- Experience building and maintaining ETL pipelines
- 3-5 years of experience with Python, SQL, and AWS (S3, EC2)
- Data visualization or reporting experience
- Good verbal and written communication skills

Preferred Qualifications:
- Cloud-based infrastructure and applications (we use AWS and maintain our own Kubernetes clusters)
- ETL tools
- Building or improving APIs
- Collecting data from a variety of sources (APIs, Postgres, Oracle, SAP, Salesforce, messaging architectures, etc.)
- Thoughtfully storing data in databases (especially Postgres) to support data products and reporting
- Big data pipelines using Spark or streaming tools
- Working with geospatial data
- Additional coding languages, especially JVM languages or Bash
- Any of these or similar tools (Kubernetes, Ansible, Jenkins, PostgreSQL, Tableau, Mapbox, Airflow)
- Additional AWS tools (EFS, Lambda, RDS/Aurora, Glue, Athena)