Data Engineer

0012008

About CCC

CCC Intelligent Solutions is a leading technology company helping to improve the insurance claims process for millions of people. Our award-winning SaaS platform connects more than 30,000 businesses, including insurance carriers, repair facilities, automakers, part suppliers, lenders, and others to streamline the process from start to finish.

Our advanced capabilities in AI, IoT, telematics, blockchain, data, and analytics drive continual innovation across our platform, as we work to advance the multi-trillion-dollar P&C insurance economy’s digital transformation.

At CCC, our mission is to keep people’s lives moving forward when it matters most. Diversity of experience and perspective are key to our pursuit so we can deliver a future of possibilities for our customers. Find out more about CCC Intelligent Solutions by visiting cccis.com.


Job Description Summary

The Enterprise Analytics team at CCC has an open position for a Data Engineer. The team builds platforms that deliver insights to internal and external clients of CCC businesses in auto property damage and repair, medical claims, and telematics data. Our solutions include analytical applications for claim processing, workflow productivity, financial performance, client and consumer satisfaction, and industry benchmarks. Our data engineers use big data technology to create best-in-industry analytics capabilities. This position is an opportunity to learn and use AWS Cloud and Spark ecosystem tools for micro-batch and streaming analytics. The Data Engineer will work closely with product owners, information engineers, data scientists, data modelers, and infrastructure and data governance teams. We look for engineers who bring 2-3 years of big data experience and who love to learn new tools and techniques in an ever-changing big data landscape.

Job Duties

  • Architect, build, and maintain big data infrastructure and data flow pipelines. This can include instrumenting AWS artifacts, locating and analyzing data, creating data flows, defining and building data cleansing, mapping to a standard data model, transforming data to satisfy business rules and statistical computations, and validating data content (a brief sketch of this kind of flow follows this list).
  • Build and maintain systems that monitor all aspects of the application and the infrastructure.
  • Participate in architecture and design discussions, implement proofs of concept, and report results to stakeholders.
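As an illustration of the pipeline work described above, the following is a minimal PySpark sketch that reads raw claim records, cleanses them, maps them to a standard model, applies a business rule, and validates content before publishing. All paths, column names, and thresholds are hypothetical and included only to show the shape of the work, not CCC's actual data model.

# Minimal PySpark sketch of a cleanse / standardize / transform / validate flow.
# All S3 paths, column names, and thresholds below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-flow-sketch").getOrCreate()

# Locate and read raw data (hypothetical landing-zone path).
raw = spark.read.json("s3://example-bucket/raw/claims/")

# Cleanse: drop records missing a claim id, trim and upper-case the status.
cleansed = (
    raw.filter(F.col("claim_id").isNotNull())
       .withColumn("status", F.upper(F.trim(F.col("status"))))
)

# Map to a standard data model (hypothetical target column names).
standardized = cleansed.select(
    F.col("claim_id").alias("claim_key"),
    F.col("status").alias("claim_status"),
    F.col("repair_cost").cast("decimal(12,2)").alias("repair_cost_usd"),
    F.to_date("loss_date").alias("loss_date"),
)

# Transform to satisfy a business rule: flag high-cost claims.
transformed = standardized.withColumn(
    "high_cost_flag", F.col("repair_cost_usd") > 10000
)

# Validate content before publishing: fail fast on negative costs.
invalid_count = transformed.filter(F.col("repair_cost_usd") < 0).count()
if invalid_count > 0:
    raise ValueError(f"{invalid_count} records have negative repair costs")

transformed.write.mode("overwrite").parquet("s3://example-bucket/curated/claims/")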

Qualifications

  • Master’s or bachelor’s degree with a specialization in engineering, programming, or analytics
  • 2-3 years of experience building, maintaining, and supporting complex data flows with structured and unstructured data
  • Proficiency in AWS Cloud and infrastructure-as-code tools, including Terraform and Ansible
  • Familiarity with Python, PySpark, Apache Kafka, and Apache Airflow (a brief orchestration sketch follows this list)
  • Experience with Unix commands and shell scripting
  • Ability to debug and fix issues and prepare root cause analysis (RCA) documentation
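As a small illustration of the Apache Airflow familiarity listed above, here is a minimal DAG sketch that chains extract, transform, and validate steps. The DAG id, schedule, and task commands are hypothetical placeholders, not a description of CCC's actual pipelines.

# Minimal Apache Airflow DAG sketch; all names and commands are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="claims_microbatch_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pull raw claim files from the landing zone'",
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="echo 'run the PySpark cleansing and mapping job'",
    )
    validate = BashOperator(
        task_id="validate",
        bash_command="echo 'check row counts and publish metrics'",
    )

    # Run the steps in order: extract, then transform, then validate.
    extract >> transform >> validate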

Preferred Qualifications

  • Understanding of data life cycle management
  • Experience with Jira, Git, and continuous integration/continuous delivery (CI/CD)
  • Experience with monitoring tools such as AWS CloudWatch, Prometheus, Nagios, and Grafana