Azure Data Warehouse Developer

Downey, CA (100% Remote)
12 Months Contract

Skills Required:
  • Strong understanding of relational database concepts, SQL (Structured Query Language), and data modeling
  • Knowledge of various databases used in data warehousing, such as Oracle, SQL Server, PostgreSQL, or MySQL
  • Proficiency in ETL tools like Azure Data Factory, Azure Synapse, or Microsoft SSIS (SQL Server Integration Services) to extract data from various sources, transform it to fit the target schema, and load it into the data warehouse
  • Ability to design and implement data warehouse data models, including star schema, snowflake schema, and dimension hierarchies for optimized data retrieval and analysis (see the sketch after this list)
  • Understanding of data integration techniques and data quality processes to ensure data accuracy, consistency, and reliability in the data warehouse
  • Knowledge of data warehouse architecture principles, such as data staging areas, data marts, data lakes, and the overall data flow
  • Familiarity with data warehouse development methodologies and the ability to apply best practices in building scalable and maintainable data warehouses
  • Proficiency in scripting languages like Python, Perl, or shell scripting for automating ETL processes and data manipulation
  • Understanding of data security principles and compliance regulations to protect sensitive information in the data warehouse
  • Skills in optimizing data warehouse performance, including query optimization, index creation, and partitioning
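To make the ETL and star-schema items above concrete, here is a minimal, illustrative sketch in Python using SQLite as a stand-in for the warehouse; the table names, columns, and sample rows are hypothetical, and a real pipeline would target Azure Synapse or SQL Server through the appropriate connector rather than SQLite.

```python
# Minimal ETL sketch: extract source rows, conform them to a hypothetical
# star schema, and load dimension and fact tables. SQLite stands in for
# the actual warehouse here so the example is self-contained.
import sqlite3

# Hypothetical source rows; in practice these would be extracted from
# Oracle, DB2, flat files, etc.
source_rows = [
    ("2024-01-05", "Widget", "West", 3, 19.99),
    ("2024-01-05", "Gadget", "East", 1, 34.50),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: one central fact table keyed to small dimension tables.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE dim_region  (region_id  INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE fact_sales  (
    sale_date  TEXT,
    product_id INTEGER REFERENCES dim_product(product_id),
    region_id  INTEGER REFERENCES dim_region(region_id),
    quantity   INTEGER,
    unit_price REAL
);
""")

def dim_key(table, name):
    """Look up or insert a dimension row and return its surrogate key."""
    cur.execute(f"INSERT OR IGNORE INTO {table} (name) VALUES (?)", (name,))
    cur.execute(f"SELECT rowid FROM {table} WHERE name = ?", (name,))
    return cur.fetchone()[0]

# Transform + load: resolve natural keys to surrogate keys, then insert facts.
for sale_date, product, region, qty, price in source_rows:
    cur.execute(
        "INSERT INTO fact_sales VALUES (?, ?, ?, ?, ?)",
        (sale_date, dim_key("dim_product", product),
         dim_key("dim_region", region), qty, price),
    )
conn.commit()
```

The design point worth noting is the surrogate-key lookup: facts reference small dimension tables by integer key, which is what keeps star-schema joins cheap to index and partition.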
Experience Required:
  • 3 years of experience with Oracle database architecture, data modeling, normalization, and performance optimization
  • 3 years of experience with mainframe databases like IBM DB2 and IMS (Information Management System), including data modeling, database design, and SQL querying
  • 3 years of experience with the Microsoft Azure cloud platform, including familiarity with other Azure services like Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Azure DevOps
  • 3 years of experience designing and developing data warehouses using other platforms like Microsoft SQL Server, Oracle, or Teradata
  • 3 years of experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka
  • 3 years of experience with data visualization tools like Power BI or Cognos to create insightful visualizations and reports based on data stored in Synapse Data Warehouse
  • 3 years of experience working with data cleansing, data profiling, and data validation techniques to ensure high data integrity in the data warehouse (see the sketch after this list)
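As a rough illustration of the data cleansing and validation work named in the last item, the following sketch (with made-up columns and rules) profiles incoming rows against simple integrity checks and separates clean rows from rejects before anything is loaded:

```python
# Data-validation sketch with hypothetical column rules: keep rows that
# pass basic integrity checks, set aside the rest for remediation.
rows = [
    {"customer_id": "1001", "email": "a@example.com", "amount": "42.00"},
    {"customer_id": "",     "email": "bad-address",   "amount": "-5"},
]

def valid(row):
    """Apply simple integrity rules; real pipelines use richer rule sets."""
    return (
        row["customer_id"].isdigit()      # required natural key present
        and "@" in row["email"]           # crude format check
        and float(row["amount"]) >= 0     # no negative amounts
    )

clean = [r for r in rows if valid(r)]
rejected = [r for r in rows if not valid(r)]
print(f"loaded {len(clean)} rows, rejected {len(rejected)}")
```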
Education Required:
This classification requires the possession of a bachelor's degree in an IT-related or Engineering field. Additional qualifying experience may be substituted for the required education on a year-for-year basis.
Recommended Skills: Apache Hadoop, Apache Kafka, Apache Spark, Architecture, Azure Data Factory, Big Data

Estimated Salary: $20 to $28 per hour based on qualifications.
