W2 - (4) - Sr Data Engineer (Google Cloud Platform tech stack, Python, SQL, Data pipelines, Google Cloud Platform certification) - Remote

Jobs via Dice

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Tanson Corp, is seeking the following. Apply via Dice today!

Duties

Scope: Develops and deploys data pipelines, integrations, and transformations to support analytics and machine learning applications and solutions, as part of an assigned product team, using various open-source programming languages and vended software to meet the desired design functionality for products and programs.

The position requires maintaining an understanding of the organization's current solutions, coding languages, and tools, and regularly calls for independent judgment. May provide consultative services to departments/divisions and leadership committees. Demonstrated experience designing, building, and installing data systems, and applying them within the Department of Data & Analytics technology framework, is required. The candidate will partner with product owners and the Analytics and Machine Learning delivery teams to identify and retrieve data, conduct exploratory analysis, pipeline and transform data to identify and visualize trends, build and validate analytical models, and translate qualitative and quantitative assessments into actionable insights.

Requirements

  • Required hands-on Google Cloud Platform (GCP) experience, including advanced skills in several of the following: Terraform, Dataflow, BigQuery, Dataproc, Cloud Data Fusion, Change Data Stream, Python/SQL, Cloud Functions, Cloud Events, Cloud Composer.
  • Google Cloud Platform certification

100% Remote - equipment will be provided.

  • Implement data pipelines using best practices for ETL/ELT, data management, and data governance.
  • Analyze and process complex data sources in a fast-paced environment.
  • Perform data modeling against large data sets for peak efficiency.
  • Identify, design, and implement process improvements that automate manual processes and leverage standard frameworks and methodologies.
  • Understand and incorporate data quality principles that ensure optimal reliability, impact, and user experience.
  • Partner across teams to support cross-platform operations.
  • Create and document functional and technical specifications.
  • Drive exploration of new features, versions, and related technologies, and provide recommendations to enhance our offerings.
  • Mentor junior engineers within the team.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field, or 5+ years of equivalent experience. 5+ years of hands-on experience programming in SQL. 3+ years of experience building and maintaining automated data pipelines and data assets using batch and/or streaming processes.


Hours Per Day: 8.00

Hours Per Week: 40.00

Pay rate: $/hr on W2.

Qualifications

  • Demonstrated experience designing, building, and installing data systems and applying them within the Department of Data & Analytics technology framework
  • Hands-on Google Cloud Platform (GCP) experience, including advanced skills in several of the following: Terraform, Dataflow, BigQuery, Dataproc, Cloud Data Fusion, Change Data Stream, Python/SQL, Cloud Functions, Cloud Events, Cloud Composer
  • Google Cloud Platform certification
  • Bachelor's degree in Computer Science, Information Technology, or a related field, or 5+ years of equivalent experience
  • 5+ years of hands-on experience programming in SQL
  • 3+ years of experience building and maintaining automated data pipelines and data assets using batch and/or streaming processes

Benefits

  • Hours Per Day: 8.00
  • Hours Per Week: 40.00
  • Pay rate: $/hr on W2

