8 - 13 years
16 - 22 Lacs
Chandigarh
Work from Office
- Experience architecting with the AWS or Azure cloud data platform
- Successfully implemented large-scale data warehouse/data lake solutions in Snowflake or AWS Redshift
- Proficient in data modelling and data architecture design; experienced in reviewing 3rd Normal Form and dimensional models
- Experience implementing master data management, including process design and implementation
- Experience implementing data quality solutions, including processes
- Experience in IoT design using AWS or Azure cloud platforms
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation
- Experience working with structured and unstructured data, including geo-spatial data
- Experience in technologies such as Python, SQL, NoSQL, Kafka, and Elasticsearch
- Hands-on experience with Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake, and Azure Search
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Mumbai
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS blob storage, data lakes, data warehouses)
- Ensure security standards are followed and implemented for all data pipeline, data science, and BI projects conducted by the team
- Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms
- Design, develop, test, customize, and troubleshoot database security systems and solutions, such as database activity monitoring, data obfuscation, and data segregation/segmentation
- Outline access, encryption, and logging requirements for data stores, and work with data solutions and data delivery teams to implement them
- Build systemic, technical controls for data discovery, classification, and tagging of sensitive information in structured and unstructured data stores
- Provide security expertise and consulting to data solution and delivery teams
- Work alongside the Worley Security team to help remediate security events/incidents
- Collaborate with the Worley Security team to ensure successful completion of roadmaps and initiatives
- Integrate security testing and controls into the different phases of the data delivery development lifecycle
- Experience working on cloud data platforms
- Experience in information security
- Experience in database administration and database management
- Experience with cloud technology built on Azure/AWS and/or Snowflake
- Knowledge of data architecture and database technologies
- Experience with data science and machine learning anomaly detection
- Experience working with vendors and developing security requirements and recommendations based on technology evaluations
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Amritsar
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS blob storage, data lakes, data warehouses)
- Ensure security standards are followed and implemented for all data pipeline, data science, and BI projects conducted by the team
- Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms
- Design, develop, test, customize, and troubleshoot database security systems and solutions, such as database activity monitoring, data obfuscation, and data segregation/segmentation
- Outline access, encryption, and logging requirements for data stores, and work with data solutions and data delivery teams to implement them
- Build systemic, technical controls for data discovery, classification, and tagging of sensitive information in structured and unstructured data stores
- Provide security expertise and consulting to data solution and delivery teams
- Work alongside the Worley Security team to help remediate security events/incidents
- Collaborate with the Worley Security team to ensure successful completion of roadmaps and initiatives
- Integrate security testing and controls into the different phases of the data delivery development lifecycle
- Experience working on cloud data platforms
- Experience in information security
- Experience in database administration and database management
- Experience with cloud technology built on Azure/AWS and/or Snowflake
- Knowledge of data architecture and database technologies
- Experience with data science and machine learning anomaly detection
- Experience working with vendors and developing security requirements and recommendations based on technology evaluations
Posted 2 months ago
4 - 9 years
7 - 11 Lacs
Allahabad
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub
- Experience migrating on-premises data warehouses to data platforms on the Azure cloud
- Designing and implementing data engineering, ingestion, and transformation functions with Azure Synapse or Azure SQL Data Warehouse
- Spark on Azure, as available in HDInsight and Databricks
- Good customer communication
- Good analytical skills
Posted 2 months ago
4 - 9 years
8 - 14 Lacs
Srinagar
Work from Office
- Strong Power BI experience
- Strong Kusto and Azure Data Lake experience
- SQL Server
- Scripting languages: Scope/U-SQL
- Experience in DAX
- Excellent visualization and formatting skills
- Good experience in data analysis and analytics
- Data modeling experience
- Strong knowledge of T-SQL
- Experience writing complex queries
- ETL awareness
- Good to have: Kusto Query Language (KQL)
- Good to have: knowledge of Azure Storage
- Excellent communication and presentation skills
Posted 2 months ago
4 - 9 years
8 - 14 Lacs
Rajkot
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub
- Experience migrating on-premises data warehouses to data platforms on the Azure cloud
- Designing and implementing data engineering, ingestion, and transformation functions with Azure Synapse or Azure SQL Data Warehouse
- Spark on Azure, as available in HDInsight and Databricks
- Good customer communication
- Good analytical skills
Posted 2 months ago
8 - 13 years
16 - 22 Lacs
Rajkot
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS blob storage, data lakes, data warehouses)
- Ensure security standards are followed and implemented for all data pipeline, data science, and BI projects conducted by the team
- Identify and drive implementation of database protection tools to detect and prevent unauthorized access to data platforms
- Design, develop, test, customize, and troubleshoot database security systems and solutions, such as database activity monitoring, data obfuscation, and data segregation/segmentation
- Outline access, encryption, and logging requirements for data stores, and work with data solutions and data delivery teams to implement them
- Build systemic, technical controls for data discovery, classification, and tagging of sensitive information in structured and unstructured data stores
- Provide security expertise and consulting to data solution and delivery teams
- Integrate security testing and controls into the different phases of the data delivery development lifecycle
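The data discovery and classification controls described in roles like this one can be sketched as a simple pattern-based scanner that tags fields containing sensitive values. This is an illustrative toy under assumed tag names and regex patterns, not any company's actual tooling; production classifiers are far more robust.

```python
import re

# Hypothetical tag names and regex patterns for common PII types.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d{4}[-\s]?){3}\d{4}\b"),
}

def classify_record(record: dict) -> dict:
    """Return a mapping of field name -> list of PII tags found in its value."""
    tags = {}
    for field, value in record.items():
        found = [tag for tag, pattern in PII_PATTERNS.items()
                 if pattern.search(str(value))]
        if found:
            tags[field] = found
    return tags

record = {"name": "A. User",
          "contact": "a.user@example.com",
          "note": "call 555-123-4567"}
print(classify_record(record))  # tags the 'contact' and 'note' fields
```

In a real deployment the tags produced by a scanner like this would feed the data catalog, driving the access, encryption, and logging requirements mentioned above.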
Posted 2 months ago
8 - 13 years
16 - 22 Lacs
Allahabad
Work from Office
- Experience architecting with the AWS or Azure cloud data platform
- Successfully implemented large-scale data warehouse/data lake solutions in Snowflake or AWS Redshift
- Proficient in data modelling and data architecture design; experienced in reviewing 3rd Normal Form and dimensional models
- Experience implementing master data management, including process design and implementation
- Experience implementing data quality solutions, including processes
- Experience in IoT design using AWS or Azure cloud platforms
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation
- Experience working with structured and unstructured data, including geo-spatial data
- Experience in technologies such as Python, SQL, NoSQL, Kafka, and Elasticsearch
- Hands-on experience with Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake, and Azure Search
Posted 2 months ago
3 - 5 years
8 - 14 Lacs
Ahmedabad
Hybrid
Role: Snowflake Engineer

Job Brief: The successful applicant will be working within a highly specialized and growing team to enable delivery of data and advanced analytics system capability.

Total Experience: 5 yrs. Relevant Experience: 2 yrs. Positions Open: 1. Job Location: Ahmedabad. Salary: Not a constraint for the right candidate.

Roles and Responsibilities:
1. Database and Data Warehouse Expertise:
- Demonstrate an excellent understanding of database and data warehouse concepts.
- Strong proficiency in writing SQL queries.
2. Snowflake Cloud Data Warehouse:
- Design and implement the Snowflake cloud data warehouse.
- Develop and implement cloud-related architecture and data modeling.
3. Migration Projects:
- Manage migration projects, specifically migrating from on-premises to Snowflake.
4. Snowflake Capabilities:
- Utilize comprehensive knowledge of Snowflake capabilities such as Snowpipe, stages, SnowSQL, streams, and tasks.
5. Advanced Snowflake Concepts:
- Implement advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy cloning.
6. Data Migration Expertise:
- In-depth knowledge and experience in data migration from RDBMS to the Snowflake cloud data warehouse.
7. Snowflake Feature Deployment:
- Deploy Snowflake features such as data sharing, events, and lakehouse patterns.
8. Incremental Extraction Loads:
- Execute incremental extraction loads, both batched and streaming.

Skill Requirements:
- Excellent understanding of database and data warehouse concepts.
- Strong proficiency in SQL query writing.
- Experience in migration projects, specifically migrating from on-premises to Snowflake.
- Comprehensive knowledge of Snowflake capabilities (Snowpipe, stages, SnowSQL, streams, and tasks).
- Implementation of advanced Snowflake concepts (resource monitors, RBAC controls, virtual warehouse sizing, zero-copy cloning).
- In-depth experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deployment of Snowflake features such as data sharing, events, and lakehouse patterns.
- Proficient in incremental extraction loads, batched and streaming.

Additional Technical Skills:
- Good to have: experience with Snowpark.
- Snowflake certification.

Qualification: Bachelor's or Master's degree in Computer Science or a related field.

Office Timings: 10.00 AM to 7.00 PM

Perks and Benefits:
- 5-day work week
- Flexible working hours
- Annual company trip
- Medical insurance
- A hybrid working model
- Skill-refining programs
- Bonuses and pay raises
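The batched incremental extraction loads this role calls for typically follow a high-watermark pattern: each run pulls only rows newer than the last recorded watermark. A minimal sketch using SQLite as a stand-in source (table and column names are hypothetical; a real pipeline would read from the RDBMS via its driver and land data in Snowflake through Snowpipe or the Snowflake connector):

```python
import sqlite3

def extract_incremental(conn, last_watermark: int):
    """Pull rows with id greater than the stored high watermark."""
    rows = conn.execute(
        "SELECT id, payload FROM source_events WHERE id > ? ORDER BY id",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][0] if rows else last_watermark
    return rows, new_watermark

# Demo with an in-memory stand-in source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO source_events VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

batch, wm = extract_incremental(conn, last_watermark=0)   # first load: all 3 rows
conn.execute("INSERT INTO source_events VALUES (4, 'd')")
batch, wm = extract_incremental(conn, last_watermark=wm)  # next load: only row 4
```

The same shape works for streaming loads: the watermark simply becomes a stream offset that is committed after each micro-batch.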
Posted 2 months ago
3 - 5 years
40 - 45 Lacs
Bhubaneshwar, Kochi, Kolkata
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on data warehouse and data lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of data warehouses and data lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
Posted 2 months ago
6 - 10 years
8 - 13 Lacs
Bengaluru
Work from Office
Mandatory Skills: data engineering, AWS Athena, AWS Glue, Redshift, data lake, lakehouse, Python, SQL Server

Must-Have Experience:
- 6+ years of hands-on data engineering experience
- Expertise with AWS services: S3, Redshift, EMR, Glue, Kinesis, DynamoDB
- Building batch and real-time data pipelines
- Python and SQL coding for data processing and analysis
- Data modeling experience using cloud-based data platforms such as Redshift, Snowflake, and Databricks
- Design and development of ETL frameworks

Nice-to-Have Experience:
- ETL development using tools like Informatica, Talend, or Fivetran
- Creating reusable data sources and dashboards for self-service analytics
- Experience using Databricks for Spark workloads, or Snowflake
- Working knowledge of big data processing
- CI/CD setup
- Infrastructure-as-code implementation
- Any one of the AWS Professional Certifications
Posted 3 months ago
3 - 7 years
8 - 14 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
About Emperen Technologies: Emperen Technologies is a leading consulting firm committed to delivering tangible results for clients through a relationship-driven approach. With successful implementations for Fortune 500 companies, non-profits, and startups, Emperen Technologies exemplifies a client-centric model that prioritizes values and scalable, flexible solutions. Emperen specializes in navigating complex technological landscapes, empowering clients to achieve growth and success.

Role Description: Emperen Technologies is seeking a highly skilled Senior Master Data Management (MDM) Engineer to join our team on a contract basis. This is a remote position in which the Senior MDM Engineer will be responsible for a variety of key tasks, including data engineering, data modeling, ETL processes, data warehousing, and data analytics. The role demands a strong understanding of MDM platforms, cloud technologies, and data integration, as well as the ability to work collaboratively in a dynamic environment.

Key Responsibilities:
- Design, implement, and manage Master Data Management (MDM) solutions to ensure data consistency and accuracy across the organization.
- Oversee the architecture and operation of data modeling, ETL processes, and data warehousing.
- Develop and execute data quality strategies to maintain high-quality data in line with business needs.
- Build and integrate data pipelines using Microsoft Azure, DevOps, and GitLab technologies.
- Implement data governance policies and ensure compliance with data security and privacy regulations.
- Collaborate with cross-functional teams to define and execute business and technical requirements.
- Analyze data to support business intelligence and decision-making processes.
- Provide ongoing support for data integration, ensuring smooth operation and optimal performance.
- Troubleshoot and resolve technical issues related to MDM, data integration, and related processes.
- Work on continuous improvements of the MDM platform and related data processes.

Qualifications — Required Skills & Experience:
- Proven experience in Master Data Management (MDM), with hands-on experience on platforms like Profisee MDM and Microsoft Master Data Services (MDS).
- Solid experience in Microsoft Azure cloud technologies.
- Expertise in DevOps processes and using GitLab for version control and deployment.
- Strong background in data warehousing, Azure Data Lake, and Business Intelligence (BI) tools.
- Expertise in data governance, data architecture, data modeling, and data integration (particularly using REST APIs).
- Knowledge and experience in data quality, data security, and privacy best practices.
- Experience working with business stakeholders and technical teams to analyze business requirements and translate them into effective data solutions.
- Basic business analysis skills: the ability to assess business needs, translate them into technical requirements, and ensure alignment between data management systems and business goals.

Preferred Qualifications:
- Experience with big data technologies and advanced analytics platforms.
- Familiarity with data integration tools such as Talend or Informatica.
- Knowledge of data visualization tools such as Power BI or Tableau.
- Certifications in relevant MDM, cloud technology, or data management platforms are a plus.

Location: Remote; Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 3 months ago
0 - 5 years
4 - 9 Lacs
Bengaluru
Work from Office
- Strong experience in the Python programming language (must-have)
- Experience with REST APIs, FastAPI, Graph APIs, and SQLAlchemy
- Good experience with Azure services such as Data Lake, Azure SQL, Function Apps, and Azure Cognitive Search
- Good understanding of chunking, embeddings, vectorization, indexing, prompting, hallucinations, and RAG
- Hands-on experience with DevOps: creating pull requests (PRs) and maintaining code repositories
- Strong communication skills and the ability to collaborate effectively with team members
- Familiarity with different GenAI frameworks, security, and governance
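The chunking step mentioned in this role (splitting documents into overlapping windows before embedding them for RAG retrieval) can be sketched in a few lines of plain Python; the chunk size and overlap values below are illustrative assumptions, not prescribed settings:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character windows that overlap,
    so content cut at one boundary still appears whole in a neighbor."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far each window advances
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

doc = "x" * 500
chunks = chunk_text(doc, chunk_size=200, overlap=50)
# windows start at characters 0, 150, 300, 450 -> 4 chunks
print(len(chunks))
```

Production pipelines usually chunk on token or sentence boundaries rather than raw characters, but the overlap idea (each window repeating the tail of the previous one) is the same.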
Posted 3 months ago
5 - 10 years
27 - 30 Lacs
Bengaluru, Gurgaon, Hyderabad
Work from Office
Key Responsibilities and Accountabilities:
- Java 1.8
- Spring Boot + JPA
- Spring Quartz
- Microservices architecture
- Databases
- SCM
- DevOps
- Expertise in SQL
- Advanced API concepts, including pagination
- Strong understanding of Microsoft/Azure API collections
- Understanding of BPMN (Business Process Model and Notation) is a plus
- Understanding of the Flowable library
- Understanding of data marts/data lakes
- Understanding of ETL

Location: Gurugram/Hyderabad/Bangalore/Chennai
Posted 3 months ago