The Data Engineer is responsible for aggregating data from various systems into a central data lake and warehouse to be used for statistical analysis, reporting, and dashboards, and for building data pipelines that transport and exchange data between multiple systems using Kafka.
This position utilizes modern cloud data warehouse technology, works with both structured and unstructured data, and uses Confluent Kafka to exchange data between systems, lakes, and applications. This is not a traditional data warehouse position and will not involve traditional schema-based relational database design.
The successful candidate will understand cloud data warehouse technology and Kafka, as well as how to organize and store data for use with advanced analytics.
KEY RESPONSIBILITIES
Develop, maintain, support, and enhance the business intelligence data backend, including data warehouses and data lakes.
Perform needs assessments.
Implement data transformations and data structures for the data warehouse and lake/repository.
Process data using Spark (PySpark).
Manage cloud and/or on-premises solutions for data transfer and storage.
Establish data structures for all enterprise data stored in business intelligence systems.
Collaborate with data analysts across functions to ensure that data meets their reporting and analysis needs.
Establish interfaces between the data warehouse and reporting tools, such as Power BI.
Assist data analysts with connecting reporting and analytics software to data warehouses, lakes, and other data sources.
Develop, maintain, and enhance data pipelines using Kafka across a variety of systems.
Manage access and permissions to data.
REQUIRED SKILLS
Bachelor's degree plus at least 5-7 years of experience, including a minimum of 2 years working with data lakes and managing data warehouse and/or business intelligence systems. An advanced degree or certifications in a related field is a plus.
Knowledge, Skills & Abilities:
Demonstrated experience with setting up data structures (tables and views) for use with modern analytics software.
Expertise with Amazon Web Services, Kafka, Spark programming, SQL-based database systems, and/or other enterprise data warehouse solutions.
Experience working with programming languages used in ETL and/or ELT environments, such as SQL and Python.
Experience working with DevOps tools such as Jenkins, Ansible, Terraform, and GitHub.
Experience with or exposure to Java is a plus.