Data Engineer–Intermediate – Information Management
At ICBC, it’s our job to make sure the car insurance system works for all British Columbians, today and in the future. If you want to make the most of your skills and expertise while growing your career, we want you. A career at ICBC is an opportunity to be part of a talented, diverse and inclusive team that is driven to serve its customers and community. You can expect a competitive salary, comprehensive benefits and a collaborative work environment. If you are reliable and dependable, contact us today to be part of our talented and diverse team as we work together to create an insurance system we can all be proud of.
We welcome applications from all qualified job seekers. If you are a job seeker with a disability, please let us know as adjustments can be made to help support you in delivering your best performance.
The Intermediate Data Engineer on the Information Management team will focus on the development of big data and analytics solutions, working closely with stakeholders to meet their decision-support requirements. This role will work with the Information Management team to support data analysis and equip the business to make data-driven decisions. We use Scala, Spark and SQL for data preparation, and Tableau to create self-service dashboards. The team uses the latest Big Data technologies, such as StreamSets, Hadoop and the Apache Spark in-memory framework.
As the Intermediate Data Engineer, you will be responsible for:
• Collaborating with customers across the organization
• Leading junior developers and acting as an Architecture Owner (AO) for IM teams
• Creating mapping documentation of data elements from source to target
• Developing & testing Data Transformation pipelines by leveraging the latest Big Data tools and technologies
• Providing subject-matter expertise on data sources, reporting workflows, business processes, and appropriate tools to analyze data
• Partnering with corporate data user teams, developing data validation and test plans, performing user acceptance testing, and providing feedback to development and sustainment teams
• Conducting analysis for moderate to complex data requests: defining data fields, determining data availability, and developing information layout, format and interactivity; presenting findings and providing clarification
To make an immediate contribution, you will bring the following:
• Experience leading teams and acting as a technical focal point
• Advanced experience coding in at least one of the following object-oriented programming languages: Scala, Java, Python or C++
• Hands-on experience working with Big Data platforms, ideally with exposure to the Hadoop ecosystem (HDFS, Apache Hive, Apache Spark, Apache Drill, Spark SQL)
• Experience in designing efficient and robust ETL pipelines
• Experience with large and complex structured and unstructured datasets
• Intermediate experience with SQL queries and relational databases
• Knowledge of Linux/Unix operating systems
• Strong understanding of data quality management processes, data analysis and data profiling
• Ability to apply critical thinking to troubleshoot technical problems, perform root-cause analysis, and evaluate solution designs
• Ability to design, develop and enforce data engineering best practices and standards
• Understanding of Agile Methodologies
• Experience with reporting and visualization tools, such as Business Objects or Tableau, is an asset
Please note only those legally entitled to work in Canada at present will be considered for this position.