Purpose and Scope:
Responsible for the delivery of enterprise data warehouse, data lake, and cloud analytical platform solutions.
Principal Duties and Responsibilities:
Designing enterprise data warehouse, cloud data platform, and data management initiatives.
- Design and architect the enterprise data warehouse following Data Vault and dimensional data modeling methodologies.
- Design the architecture for data management activities, i.e., data extraction, transformation, and loading (ETL) into the enterprise data warehouse. Implement data management solutions using SQL, SQL stored procedures, Unix, Python, and data management tools such as Informatica PowerCenter on an Oracle Exadata database.
Designing, implementing, maintaining and supporting data stores used for reporting and analytics (such as data lake, data warehouse systems and related data marts).
- Design the overall architecture to Extract, Transform, and Load (ETL) data into the data warehouse or data mart and the data lake.
- Develop code using SQL, SQL stored procedures, Unix, Python, and ETL tools such as Informatica PowerCenter; AWS services such as S3, Data Pipeline, AWS Glue, AWS Lambda, and Athena; and Azure services such as Azure Data Lake, Azure Data Factory, and Azure Databricks to load data into the data warehouse, data marts, and data lake.
Coordinating with peers from other clinical and IT groups to ensure successful design, planning and completion of programs and projects.
- Understand upstream and downstream application processes as part of requirements gathering, working with peers from other clinical and IT groups.
Understanding business requirements and priorities to shape the scope and requirements of the Data Warehouse solution.
- Gather and understand business requirements and translate them into functional and technical requirements for the data warehouse solution.
Performing data analysis and data provisioning against source systems.
- Perform analysis against source data by running complex SQL queries and developing ETL code to gain deeper insight into source data as part of requirements gathering.
- Perform data analysis on source data to investigate and resolve production issues reported by end users.
Designing the processes to acquire and integrate disparate data from multiple sources.
- Design and develop ETL processes using SQL and Informatica to load data into the data warehouse on a daily basis from various source applications on SQL Server, Oracle, cloud sources, and flat files.
- Design and develop processes in AWS or Microsoft Azure to integrate real-time IoT data and on-premises data into the data lake.
Assisting in the data architecture design to be used for Business Intelligence and Analytical purposes.
- Assist downstream users with data warehouse requirements and architecture questions; provide SQL queries and develop ETL code to read data from the data warehouse for business intelligence and analytical needs.
Documenting technical specifications for the design of data integration and data architecture.
- Document functional and technical requirements in technical specifications for developing code using SQL, SQL stored procedures, Unix, Python, and ETL tools such as Informatica PowerCenter.
Performing unit testing to ensure repeatable, quality processes with accurate data results, and installing processes for auditing target data stores and ensuring data quality.
Providing input and experience on the architecture, design, and evolution of the FMCNA enterprise data warehouse and information delivery solutions, including tool selection, migration strategies, and risk mitigation.
Education, Experience, and Required Skills:
Position requires a Bachelor's degree (or an equivalent foreign degree) in Computer Science, Electrical Engineering, IT, or a closely related field and 8 years of experience as a Data Warehouse Developer. Must also have 5 years of experience (which can have been gained concurrently with the other experience requirements) working with Informatica PowerCenter, SQL Server, SQL and PL/SQL, dimensional and Data Vault modeling, ETL design, and Python and Unix scripting; and 2 years of experience (which can have been gained concurrently with the other experience requirements) working with AWS services including S3, Data Pipeline, AWS Glue, AWS Lambda, EMR, and Athena.