8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer
Location: Pune, India (Hybrid)
Type: Contract (6 Months)
Experience: 5–8 Years
Domain: Financial Services
Work Timing: Regular Day Shift
Background Check: Mandatory before onboarding
Job Description: Seeking experienced Data Engineers with strong hands-on skills in SQL, Python, Azure Databricks, ADF, and PySpark. Candidates should have experience in data modeling, ETL design, big data technologies, and large-scale on-prem to cloud migrations using the Azure data stack.
Mandatory Skills: Azure Databricks, Azure Data Factory, Python, PySpark
Preferred Skills: Spark, Kafka, Azure Synapse, Azure SQL, Azure Data Lake, Azure Cosmos DB, batch and real-time ingestion
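The pipeline work described above centres on PySpark and ADF moving data from on-prem sources into Azure storage. As a rough illustration of the kind of batch ingestion step such a role involves, here is a minimal PySpark sketch assuming a Databricks (Delta-enabled) environment; the storage paths and column names are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch-ingestion sketch: read raw CSV landed in a data lake,
# apply light cleansing, and write it out as a partitioned Delta table.
spark = SparkSession.builder.appName("batch_ingest_example").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"          # hypothetical ADLS path
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/"  # hypothetical ADLS path

raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

clean_df = (
    raw_df
    .dropDuplicates()
    .withColumn("ingest_date", F.current_date())
)

(
    clean_df.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save(curated_path)
)
```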
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Experience level: 5 to 10 years. Strong in Azure and ADF, Azure Databricks or Fabric, and data pipelines. At least 3 years of relevant experience in Azure Databricks.
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Azure Big Data Engineer is responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager; the role is hybrid, with 2 days per week working from the Hyderabad office.
Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric.
Develop, configure, and optimize data lakes and warehouses on Azure using services such as Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability.
Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems.
Troubleshoot and resolve data-related issues in a timely manner.
Produce clear, concise technical documentation for all developed solutions.
Requirements:
Experience with SSIS, SQL Jobs, BIDS, and ADF.
Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.).
Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage.
Experience in data modeling, data architecture, and implementing ETL/ELT solutions.
Proficiency in SQL and familiarity with other programming languages such as Python or Scala.
Knowledge of data modeling, data warehousing, and big data technologies.
Experience with data governance and security best practices.
About Experian
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.
Experience and Skills:
Bachelor's in computer science or a related field; Master's preferred.
6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, Synapse.
2+ years of Azure cloud experience.
Experience with Microsoft Fabric and Azure Synapse.
Understanding of ML/AI concepts.
Azure certifications.
Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's strong people-first approach is award winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together. Benefits: Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. Find out what it's like to work for Experian by clicking here.
Posted 1 month ago
8.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Job Description
We are seeking a skilled and motivated Data Engineer with 8+ years of experience to join our team. The ideal candidate should be experienced in data engineering on Snowflake, Azure ADF, Microsoft MDS, SQL, and data pipelines, with a focus on developing and maintaining Data Analytics solutions. You will collaborate with cross-functional teams to deliver high-quality data solutions that meet business requirements.
Required Skills and Experience:
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
8+ years of experience in data engineering or related fields.
Strong proficiency in SQL, Snowflake, stored procedures, and views.
Hands-on experience with Snowflake SQL, ADF (Azure Data Factory), and Microsoft MDS (Master Data Services).
Knowledge of data warehousing concepts.
Experience with cloud platforms (Azure).
Understanding of data modeling and data warehousing principles.
Strong problem-solving and analytical skills, with attention to detail.
Excellent communication and collaboration skills.
Bonus Skills:
Exposure to CI/CD practices using Microsoft Azure DevOps.
Basic knowledge or understanding of Power BI.
Key Responsibilities:
Design, develop, and maintain scalable and efficient data pipelines using Azure Data Factory (ADF).
Build and optimize data models and data warehousing solutions within Snowflake.
Develop and maintain data integration processes, ensuring data quality and integrity.
Utilize strong SQL skills to query, transform, and analyze data within Snowflake.
Develop and manage stored procedures and views in Snowflake.
Implement and manage master data using Microsoft Master Data Services (MDS).
Collaborate with data analysts and business stakeholders to understand data requirements and deliver effective data solutions.
Ensure the performance, reliability, and security of data pipelines and data warehousing systems.
Troubleshoot and resolve data-related issues in a timely manner.
Stay up-to-date with the latest advancements in data engineering technologies, particularly within the Snowflake and Azure ecosystems.
Contribute to the documentation of data pipelines, data models, and ETL processes. (ref:hirist.tech)
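The role leans heavily on Snowflake SQL, views, and stored procedures driven from Python. A minimal sketch of that pattern using the snowflake-connector-python library follows; the account, warehouse, and object names are hypothetical placeholders rather than anything from the posting.

```python
import snowflake.connector

# Minimal sketch: connect to Snowflake, create a reporting view, and query it.
# All connection parameters below are placeholders.
conn = snowflake.connector.connect(
    account="my_account",        # hypothetical
    user="etl_user",             # hypothetical
    password="********",
    warehouse="ANALYTICS_WH",    # hypothetical
    database="SALES_DB",         # hypothetical
    schema="CURATED",            # hypothetical
)

try:
    cur = conn.cursor()
    # Create (or replace) a simple aggregation view over a curated table.
    cur.execute("""
        CREATE OR REPLACE VIEW DAILY_REVENUE AS
        SELECT order_date, SUM(amount) AS total_revenue
        FROM ORDERS
        GROUP BY order_date
    """)
    # Query the view and fetch a few rows as a sanity check.
    cur.execute("SELECT * FROM DAILY_REVENUE ORDER BY order_date DESC LIMIT 5")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```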
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.
Responsibilities:
Design, develop, and execute comprehensive test plans for ETL and database validation processes.
Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI).
Perform manual testing and defect tracking using Zephyr or Tosca.
Analyze business and data requirements to ensure full test coverage.
Write and execute complex SQL queries for data reconciliation.
Identify data-related issues and conduct root cause analysis in collaboration with developers.
Track and manage bugs and enhancements through appropriate tools.
Optimize testing strategies for performance, scalability, and accuracy in ETL.
Skills:
ETL Tools: Talend, ADF
Data Platforms: Snowflake
Reporting/Analytics: Power BI, VPI
Testing Tools: Zephyr or Tosca, manual testing
Strong SQL expertise for validating complex data.
Additional Skills:
API testing exposure
Power BI advanced features (dashboards, DAX, data modelling)
(ref:hirist.tech)
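Much of the testing described above comes down to reconciling source and target datasets with SQL. The sketch below shows one simple way to do that from Python, comparing row counts and a numeric checksum between a staging table and its loaded target; the connection strings and table names are hypothetical, and real reconciliation logic would be tailored to the actual pipeline.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings for the source (staging) and target (warehouse) systems.
source_engine = create_engine("postgresql://user:pass@source-host/stagedb")
target_engine = create_engine("snowflake://user:pass@my_account/SALES_DB/CURATED")

# Row-count reconciliation between the staged extract and the loaded table.
src_count = pd.read_sql("SELECT COUNT(*) AS n FROM stage_orders", source_engine).iloc[0, 0]
tgt_count = pd.read_sql("SELECT COUNT(*) AS n FROM orders", target_engine).iloc[0, 0]
print(f"source rows={src_count}, target rows={tgt_count}, match={src_count == tgt_count}")

# Column-level reconciliation: compare a simple aggregate (sum) on a numeric column.
src_sum = pd.read_sql("SELECT SUM(amount) AS s FROM stage_orders", source_engine).iloc[0, 0]
tgt_sum = pd.read_sql("SELECT SUM(amount) AS s FROM orders", target_engine).iloc[0, 0]
print(f"amount checksum match: {src_sum == tgt_sum}")
```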
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Databricks Data Engineer
Key Responsibilities:
Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark.
Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing.
Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks.
Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows.
Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing.
Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage.
Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools.
Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management.
Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption.
Collaborate with Data Scientists, Analysts, and Business Teams to deliver solutions.
Skills & Experience:
5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake.
Strong SQL, PySpark, and Python programming skills.
Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow.
Expertise in performance tuning, indexing, caching, and parallel processing.
Hands-on experience with Lakehouse architecture and Databricks SQL.
Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog).
Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins).
Familiarity with Airflow, Databricks Workflows, or orchestration tools.
Strong problem-solving skills with experience in troubleshooting Spark jobs.
Nice To Have:
Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks.
Certifications in Databricks, Azure, AWS, or GCP.
(ref:hirist.tech)
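To make the medallion and Delta optimization responsibilities above concrete, here is a minimal PySpark sketch of a Bronze-to-Silver hop followed by an OPTIMIZE ... ZORDER pass. It assumes a Databricks (or otherwise Delta-enabled Spark) environment, and the paths, table name, and columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion_example").getOrCreate()

bronze_path = "abfss://lake@examplelake.dfs.core.windows.net/bronze/orders/"  # hypothetical
silver_path = "abfss://lake@examplelake.dfs.core.windows.net/silver/orders/"  # hypothetical

# Bronze -> Silver: read raw Delta data, deduplicate and standardize types.
bronze_df = spark.read.format("delta").load(bronze_path)

silver_df = (
    bronze_df
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

silver_df.write.format("delta").mode("overwrite").save(silver_path)

# Register the Silver layer as a table and compact/cluster it for faster point lookups.
spark.sql(f"CREATE TABLE IF NOT EXISTS silver_orders USING DELTA LOCATION '{silver_path}'")
spark.sql("OPTIMIZE silver_orders ZORDER BY (customer_id)")
```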
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Years of Experience: 5
Job Description
We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Primary Skills: ADF, Databricks
Secondary Skills:
Responsibilities:
Translate functional specifications and change requests into technical specifications.
Translate business requirement documents, functional specifications, and technical specifications into related coding.
Develop efficient code with unit testing and code documentation.
Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving.
Set up the development environment and configure the development tools.
Communicate with all the project stakeholders on the project status.
Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
Contribute to the automation of modules, wherever required.
Be proficient in written, verbal and presentation communication (English).
Coordinate with the UAT team.
Role Requirements:
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.; see the sketch below).
Knowledgeable in Shell / PowerShell scripting.
Knowledgeable in relational databases, non-relational databases, data streams, and file stores.
Knowledgeable in performance tuning and optimization.
Experience in data profiling and data validation.
Experience in requirements gathering and documentation processes and performing unit testing.
Understanding and implementing QA and various testing processes in the project.
Knowledge of any BI tools will be an added advantage.
Sound aptitude, outstanding logical reasoning, and analytical skills.
Willingness to learn and take initiative.
Ability to adapt to a fast-paced Agile environment.
Additional Requirements:
Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
Utilize Azure Databricks for data transformation and processing.
Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.
(ref:hirist.tech)
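Since the role calls out change data capture and slowly changing dimensions, here is a minimal sketch of a Type 2 SCD load against a Delta table using the delta-spark Python API. It assumes a Databricks/Delta environment; the table path, keys, and columns are hypothetical, and production logic would also handle late-arriving and unchanged records.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("scd2_example").getOrCreate()

dim_path = "abfss://lake@examplelake.dfs.core.windows.net/gold/dim_customer/"  # hypothetical

# Incoming changes captured from the source system (CDC feed or daily extract).
updates = spark.read.format("delta").load(
    "abfss://lake@examplelake.dfs.core.windows.net/silver/customer_changes/"   # hypothetical
)

dim = DeltaTable.forPath(spark, dim_path)

# Step 1: close out the current version of any customer whose tracked attribute changed.
(
    dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "end_date": "current_date()"},
    )
    .execute()
)

# Step 2: append the incoming records as the new current rows.
new_rows = (
    updates
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
)
new_rows.write.format("delta").mode("append").save(dim_path)
```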
Posted 1 month ago
7.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior Data Engineer - Azure, ETL, Snowflake
Experience: 7+ yrs
Location: Remote
Job Summary
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools like ADF, Databricks, Fivetran, and DBT.
Key Responsibilities:
Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake.
Write optimized SQL queries, stored procedures, and views to transform and retrieve data.
Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc.
Work on data mapping, modelling, and transformation tasks across multiple data sources.
Build and deploy custom connectors using Python, PySpark, or ADF (see the sketch below).
Implement and manage Snowflake as a data storage and processing solution.
Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub.
Ensure smooth cloud migration and data pipeline deployment using Azure services.
Work with Fivetran and DBT for ingestion and transformation as required.
Participate in Agile/Scrum ceremonies and follow DevSecOps practices.
Mandatory Skills & Qualifications:
7+ years of experience in Data Engineering, ETL development, or similar roles.
Proficient in SQL with a strong understanding of joins, filters, and aggregations.
Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.).
Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage.
Experience with Azure cloud services, specifically Azure Data Factory (ADF), Databricks, and Azure Data Lake.
Hands-on experience in Snowflake implementation (ETL or storage layer).
Familiarity with data modelling, data mapping, and pipeline creation.
Experience working with semi-structured/unstructured data formats.
Working knowledge of GitHub for version control and code management.
Good To Have / Preferred Skills:
Experience using Fivetran and DBT for ingestion and transformation.
Knowledge of AWS or GCP cloud environments.
Familiarity with DevSecOps processes and CI/CD pipelines within Azure.
Proficiency in Excel and macros.
Exposure to Agile methodologies (Scrum/Kanban).
Understanding of custom connector creation using PySpark or ADF.
Soft Skills:
Strong analytical and problem-solving skills.
Effective communication and teamwork abilities.
Ability to work independently and take ownership of deliverables.
Detail-oriented with a commitment to quality.
Why Join Us?
Work on modern, cloud-based data platforms.
Exposure to a diverse tech stack and new-age data tools.
Flexible remote working opportunity aligned with a global team.
Opportunity to work on critical enterprise-level data solutions.
(ref:hirist.tech)
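As a rough illustration of the custom-connector work mentioned above, the sketch below pulls JSON from a REST API with Python and lands it as Parquet for downstream loading. The API URL, token, pagination scheme, and output path are hypothetical placeholders.

```python
import requests
import pandas as pd

# Hypothetical REST endpoint and credentials for the source system.
API_URL = "https://api.example.com/v1/orders"
HEADERS = {"Authorization": "Bearer <token>"}

def fetch_all(url: str) -> list[dict]:
    """Page through the API and collect all records (assumes a 'page' query parameter)."""
    records, page = [], 1
    while True:
        resp = requests.get(url, headers=HEADERS, params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Flatten nested JSON into a tabular frame and land it as Parquet for the next pipeline stage.
df = pd.json_normalize(fetch_all(API_URL))
df.to_parquet("landing/orders.parquet", index=False)   # hypothetical landing path
print(f"Landed {len(df)} records")
```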
Posted 1 month ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
7+ years of hands-on experience with Azure (ADF pipelines), Databricks, Data Lake, Python, and SQL/Snowflake [Must Have]
Hands-on experience with data architecture and design techniques (local/abstract), including API techniques/development [Must Have]
Hands-on experience with real-time data pipelining technologies such as FastAPI [Must Have]
Experience with Delta Lake would be preferable
Experience with architecting, designing and developing big-data processing pipelines
Extensive experience with data platforms; working with large data sets, large amounts of data in motion and numerous big data technologies
Experience with cloud-based software, specifically Microsoft Azure (including but not limited to Databricks and Azure Functions)
Experience with Agile project delivery techniques (e.g. Scrum, Kanban)
Good interpersonal, communication, facilitation and presentation skills
Comfortable communicating with business colleagues and working in cross-functional global teams
Ability to work under pressure with tight deadlines and to balance multiple priorities
Analytical, troubleshooting and problem-solving skills
Excellent analytical and problem-solving skills, willingness to take ownership and resolve technical challenges
Excellent communication and stakeholder management skills
(ref:hirist.tech)
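The posting names FastAPI as the real-time pipelining technology. A minimal sketch of an ingestion endpoint that could sit at the front of such a pipeline is shown below; the event schema and the downstream handoff (here just an in-memory list) are hypothetical stand-ins for a real queue or stream.

```python
from datetime import datetime
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Event(BaseModel):
    # Hypothetical event schema for incoming records.
    source: str
    payload: dict
    received_at: datetime | None = None

# Stand-in for a real sink (Kafka topic, Event Hub, Delta table, etc.).
BUFFER: list[dict] = []

@app.post("/events")
async def ingest(event: Event) -> dict:
    """Accept a single event and hand it to the downstream pipeline."""
    record = event.model_dump()  # model_dump() is the Pydantic v2 API; use .dict() on v1
    record["received_at"] = datetime.utcnow().isoformat()
    BUFFER.append(record)
    return {"status": "accepted", "buffered": len(BUFFER)}

# Run locally with: uvicorn ingest_api:app --reload   (assuming this file is ingest_api.py)
```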
Posted 1 month ago
7.0 - 14.0 years
0 Lacs
Greater Kolkata Area
On-site
Key Responsibilities
Develop and optimize complex SQL queries, including joins (inner/outer), filters, and aggregations.
Work with diverse datasets from multiple database sources, ensuring data quality and integrity.
Leverage Python for data manipulation, including functions, iterations, API requests, and JSON flattening (see the sketch below).
Use Python to interpret, manipulate, and process data to facilitate downstream analysis.
Design, implement, and optimize ETL processes and workflows.
Manage data ingestion from various formats (e.g., JSON, Parquet, TXT, XLSX) using tools like Informatica, Teradata, DataStage, Talend, and Snowflake.
Demonstrate expertise in Azure services, specifically ADF, Databricks, and Azure Data Lake.
Create, manage, and optimize cloud-based data pipelines.
Integrate data sources via Fivetran or custom connectors (e.g., PySpark, ADF).
Lead the implementation of Snowflake as an ETL and storage layer.
Ensure seamless data connectivity, including handling semi-structured/unstructured data.
Promote code and manage changes across various environments.
Proficient in writing complex SQL scripts, including stored procedures, views, and functions.
Hands-on experience with Snowflake in multiple projects.
Familiarity with DBT for transformation logic and Fivetran for data ingestion.
Strong understanding of data modeling and data warehousing fundamentals.
Experience with GitHub for version control and code management.
Skills & Experience:
7 to 14 years of experience in Data Engineering, with a focus on SQL, Python, ETL, and cloud technologies.
Hands-on experience with Snowflake implementation and data pipeline management.
In-depth understanding of Azure cloud tools and services, such as ADF, Databricks, and Azure Data Lake.
Expertise in designing and managing ETL workflows, data mapping, and ingestion from multiple data sources/formats.
Proficient in Python for data interpretation, manipulation, and automation tasks.
Strong knowledge of SQL, including advanced techniques such as stored procedures and functions.
Experience with GitHub for version control and collaborative development.
Nice to Have:
Experience with other cloud platforms (e.g., AWS, GCP).
Familiarity with DataOps and continuous integration/continuous delivery (CI/CD) practices.
Prior experience leading or mentoring teams of data engineers.
(ref:hirist.tech)
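JSON flattening comes up explicitly in this posting, so here is a minimal pure-Python helper that flattens an arbitrarily nested dictionary into dot-separated keys. It is a generic sketch rather than a library API, and the sample record is invented for illustration.

```python
def flatten_json(obj: dict, parent_key: str = "", sep: str = ".") -> dict:
    """Recursively flatten a nested dict into a single level with dot-separated keys.
    List elements are indexed by position (e.g. items.0.sku)."""
    flat = {}
    for key, value in obj.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten_json(value, new_key, sep))
        elif isinstance(value, list):
            for i, item in enumerate(value):
                if isinstance(item, dict):
                    flat.update(flatten_json(item, f"{new_key}{sep}{i}", sep))
                else:
                    flat[f"{new_key}{sep}{i}"] = item
        else:
            flat[new_key] = value
    return flat

# Invented sample record, e.g. a response body from a source-system API.
record = {
    "order_id": 42,
    "customer": {"id": 7, "name": "Asha"},
    "items": [{"sku": "A1", "qty": 2}, {"sku": "B9", "qty": 1}],
}
print(flatten_json(record))
# {'order_id': 42, 'customer.id': 7, 'customer.name': 'Asha', 'items.0.sku': 'A1', ...}
```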
Posted 1 month ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
Responsibilities:
Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
Develop high-level and detailed data architecture and design documentation.
Implement data management and data governance strategies, ensuring compliance with industry standards.
Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
Design and manage data pipeline processes for historic data migration and data integration.
Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.
Requirements:
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments (Azure).
Extensive experience with common Azure services such as ADLS, Synapse, Databricks, Azure SQL, etc.
Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics.
Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning.
Strong knowledge of Power BI architecture, DAX, and dashboard optimization.
In-depth experience with SQL, Python, and/or PySpark.
Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog.
Experience in implementing CI/CD pipelines for data and BI components (e.g., using DevOps or GitHub).
Experience building semantic models in Power BI.
Strong expertise in data exploration using SQL and a deep understanding of data relationships.
Extensive knowledge and implementation experience in data management, governance, and security frameworks.
Proven experience in creating high-level and detailed data architecture and design documentation.
Strong aptitude for business analysis to understand domain data requirements.
Proficiency in data modeling using any modeling tool for conceptual, logical, and physical models is preferred.
Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs.
Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
Familiarity with Data Fabric and Data Mesh architecture is a plus.
Excellent verbal and written communication skills.
(ref:hirist.tech)
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.
Job Description
The Azure Big Data Engineer is responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager; the role is hybrid, with 2 days per week working from the Hyderabad office.
Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric.
Develop, configure, and optimize data lakes and warehouses on Azure using services such as Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability.
Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems.
Troubleshoot and resolve data-related issues in a timely manner.
Produce clear, concise technical documentation for all developed solutions.
Requirements:
Experience with SSIS, SQL Jobs, BIDS, and ADF.
Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.).
Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage.
Experience in data modeling, data architecture, and implementing ETL/ELT solutions.
Proficiency in SQL and familiarity with other programming languages such as Python or Scala.
Knowledge of data modeling, data warehousing, and big data technologies.
Experience with data governance and security best practices.
Qualifications:
Bachelor's in computer science or a related field; Master's preferred.
6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, Synapse.
2+ years of Azure cloud experience.
Experience with Microsoft Fabric and Azure Synapse.
Understanding of ML/AI concepts.
Azure certifications.
Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's strong people-first approach is award winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together. Benefits: Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. Find out what it's like to work for Experian by clicking here.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Gurugram
Work from Office
Job Description
About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.
Role: Azure Data Engineer
Experience: 4+ Years
Skill Set: Azure Synapse, PySpark, ADF and SQL
Location: Pune, Hyderabad, Gurgaon
5+ years of experience in software development, technical operations, and running large-scale applications.
4+ years of experience in developing or supporting Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL and Synapse data warehouse, Azure Cosmos DB.
2+ years of experience working in Data Engineering.
Any experience in data virtualization products like Denodo is desirable.
Azure Data Engineer or Solutions Architect certification is desirable.
Should have a good understanding of container platforms like Docker and Kubernetes.
Should be able to assess the application/platform from time to time for architectural improvements and provide inputs to the relevant teams.
Very good troubleshooting skills (quick identification of application issues and providing quick resolutions with no or minimal user/business impact).
Hands-on experience in working with high-volume, mission-critical applications.
Deep appreciation of IT tools, techniques, systems, and solutions.
Excellent communication skills, along with experience in driving triage calls involving different technical stakeholders.
Creative problem-solving skills related to cross-functional issues amidst changing priorities.
Should be flexible and resourceful to swiftly manage changing operational goals and demands.
Good experience in handling escalations and taking complete responsibility and ownership of all critical issues through to a technical/logical closure.
Good understanding of the IT Infrastructure Library (ITIL) framework and various IT Service Management (ITSM) tools available in the marketplace.
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
About the Role: We're hiring 2 Cloud & Data Engineering Specialists to join our fast-paced, agile team. These roles are focused on designing, developing, and scaling modern, cloud-based data engineering solutions using tools like Azure, AWS, GCP, Databricks, Kafka, PySpark, SQL, Snowflake, and ADF.
Position 1: Cloud & Data Engineering Specialist (Resource 1)
Key Responsibilities:
Develop and manage cloud-native solutions on Azure or AWS
Build real-time streaming apps with Kafka (see the sketch below)
Engineer services using Java and Python
Deploy and manage Kubernetes-based containerized applications
Process big data using Databricks
Administer SQL Server and Snowflake databases, write advanced SQL
Utilize Unix/Linux for system operations
Must-Have Skills:
Azure or AWS cloud experience
Kafka, Java, Python, Kubernetes
Databricks, SQL Server, Snowflake
Unix/Linux commands
Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
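Real-time streaming with Kafka is the core of this position. Below is a minimal producer sketch using the confluent-kafka Python client (one common choice; the posting does not name a specific library), with a hypothetical broker address, topic, and event shape.

```python
import json
from confluent_kafka import Producer

# Hypothetical broker and topic for the streaming pipeline.
producer = Producer({"bootstrap.servers": "localhost:9092"})
TOPIC = "orders"

def delivery_report(err, msg):
    """Called once per message to confirm delivery or report an error."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

# Publish a few example events; in a real app these would come from an upstream source.
for i in range(3):
    event = {"order_id": i, "amount": 10.0 * (i + 1)}
    producer.produce(TOPIC, key=str(i), value=json.dumps(event), callback=delivery_report)

producer.flush()  # block until all queued messages are delivered
```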
Posted 1 month ago
7.0 - 12.0 years
6 - 16 Lacs
Bengaluru
Remote
5+ years of experience with strong SQL query/development skills
Hands-on experience with ETL tools
Experience working in the healthcare industry with PHI/PII
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)
Requirements
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)
Job responsibilities
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 month ago
3.0 - 7.0 years
10 - 20 Lacs
Bengaluru
Hybrid
Key Skills: MSBI, Azure, SSIS, SSRS, SQL Server, ETL, ADF
Roles and Responsibilities:
Develop, schedule, and manage ETL processes using SSIS and Azure Data Factory (ADF)
Design and implement OLAP cubes and data models using SSAS and Azure Analysis Services
Develop and deploy reports and dashboards using SSRS and Power BI
Migrate on-premises workloads to Azure SQL, Data Lake, and Synapse Analytics
Build scalable data pipelines and workflows using Databricks (PySpark)
Write complex T-SQL queries, stored procedures, functions, and triggers for data operations
Create and manage CI/CD pipelines for data deployment using Azure DevOps
Collaborate with business stakeholders to translate functional requirements into technical solutions
Track, document, and resolve issues using ServiceNow, JIRA, or ALM
Maintain comprehensive technical documentation, including Low-Level Design (LLD) and testing artifacts
Skills Required:
Expertise in MSBI Stack: SSIS, SSRS, SSAS
Strong experience with Azure Data Services: ADF, Synapse, Databricks, Cosmos DB, Azure SQL
Advanced proficiency in SQL Server (2008R2-2019)
Proficient in programming languages such as Python and C#
Hands-on experience with CI/CD tools and Azure DevOps
Strong understanding of data warehousing, data modeling, and ETL best practices
Experience integrating with Azure Synapse, Purview, and Power BI
Excellent communication, problem-solving, and stakeholder engagement skills
Preferred Certifications:
Microsoft Certified: Azure Data Engineer Associate (DP-203)
Microsoft Certified: Azure Fundamentals (AZ-900)
Microsoft Certified: Data Analyst Associate (Power BI)
Microsoft Certified: Azure Solutions Architect Expert (AZ-305)
Microsoft Certified: Azure Database Administrator (DP-300)
Microsoft Certified: SQL Server Database Development
Microsoft Certified: Enterprise Data Analyst (DP-500)
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
Posted 1 month ago
3.0 - 4.0 years
0 - 1 Lacs
Bengaluru
Remote
Cloud & Data Engineering Specialist - Remote Work Contract
Duration: 6 months
Exp Level: 3 - 4 years (Must be able to work according to JD)
Work Timings: 2:30 pm to 11:30 pm IST
We are seeking two highly skilled Cloud & Data Engineering Specialists to join our dynamic team. These roles will focus on designing, building, and optimizing scalable cloud-based solutions, data pipelines, and analytics platforms. The ideal candidates will have strong expertise in cloud platforms, data engineering, and modern technologies, with a focus on delivering robust, secure, and efficient data solutions.
Position 1: Cloud & Data Engineering Specialist (Resource 1)
Key Responsibilities:
Design, develop, and maintain cloud-based solutions on Azure or AWS.
Implement and manage real-time data streaming and messaging systems using Kafka.
Develop scalable applications and services using Java and Python.
Deploy, manage, and monitor containerized applications using Kubernetes.
Build and optimize big data processing pipelines using Databricks.
Manage and maintain databases, including SQL Server and Snowflake, and write complex SQL scripts.
Work with Unix/Linux commands to manage and monitor system operations.
Collaborate with cross-functional teams to ensure seamless integration of cloud-based solutions.
Key Skills:
Expertise in Azure or AWS cloud platforms.
Proficiency in Kafka, Java, Python, and Kubernetes.
Hands-on experience with Databricks for big data processing.
Strong database management skills with SQL Server, Snowflake, and advanced SQL scripting.
Solid understanding of Unix/Linux commands.
Position 2: Cloud & Data Engineering Specialist (Resource 2)
Key Responsibilities:
Design and implement cloud solutions across Azure, AWS, and GCP platforms.
Develop and optimize data pipelines using PySpark, Python, and SQL.
Build and manage ETL workflows using Azure Data Factory (ADF).
Work with big data technologies such as Apache Spark and Databricks to process large datasets.
Design and deliver dashboards and reports using Tableau and Power BI.
Implement DevOps practices, including version control with Git, CI/CD pipelines, and containerization using Docker.
Collaborate with stakeholders to gather requirements and deliver scalable data solutions.
Key Skills:
Proficiency in Azure, AWS, and GCP cloud platforms.
Strong programming skills in Python, SQL, and PySpark.
Experience with Snowflake and SQL Server databases.
Expertise in ETL tools like Azure Data Factory (ADF).
Hands-on experience with Apache Spark and Databricks for big data processing.
Proficiency in reporting tools such as Tableau and Power BI.
Knowledge of DevOps practices, including Git, CI/CD pipelines, and Docker.
General Requirements for Both Roles:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
3+ years of experience in cloud and data engineering roles.
Strong problem-solving and analytical skills.
Excellent communication and collaboration abilities.
Proven ability to work in a fast-paced, agile environment.
Send resume to kalaivanan.balasubramaniam@covalensedigital.com
Thanks, Kalai - 8015302990
Posted 1 month ago
0 years
0 Lacs
Yamunanagar, Haryana, India
On-site
We’re Hiring: Sr. Data Analytics & Quality Engineer | Hyderabad (Work from Office) 🚨
Are you an experienced Data Analytics & Quality Engineer with a passion for delivering high-quality, validated data in a fast-paced environment? We’re looking for a strong professional to join our dynamic team in Hyderabad!
💡 Role: Sr. Data Analytics and Quality Engineer
📍 Location: Hyderabad (Work from Office)
🔧 Must-Have Skills:
• Programming: SQL (T-SQL or PL/SQL)
• Experience: QA process, ETL testing, data validation, data quality, RCM knowledge, Power BI, DAX
• Testing Tools (Nice to Have): Great Expectations, Deequ, dbt (with tests), Pytest
• Domain Expertise: US Healthcare (preferred)
• Tools: SSMS, TOAD, SSIS, ADF, BI tools (Power BI, Tableau), Snowflake (nice to have)
• Processes: Agile methodology, CI/CD integration, test strategy & planning, data reconciliation
• Bonus: Python knowledge, mentoring experience
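The posting asks for ETL and data-validation experience with Pytest (with Great Expectations and dbt tests as nice-to-haves). Below is a minimal Pytest sketch of the kind of data-quality checks involved, run against a pandas extract; the data, rules, and thresholds are hypothetical placeholders.

```python
import pandas as pd
import pytest

@pytest.fixture(scope="module")
def claims() -> pd.DataFrame:
    # In a real suite this would come from the warehouse, e.g.
    # pd.read_sql("SELECT * FROM curated.claims", engine); a small inline frame is used here.
    return pd.DataFrame(
        {
            "claim_id": [101, 102, 103],
            "patient_id": [1, 2, 2],
            "billed_amount": [250.0, 80.5, 1200.0],
        }
    )

def test_primary_key_is_unique_and_not_null(claims):
    assert claims["claim_id"].notna().all()
    assert claims["claim_id"].is_unique

def test_billed_amount_within_expected_range(claims):
    # Hypothetical business rule: amounts must be positive and below a sanity ceiling.
    assert (claims["billed_amount"] > 0).all()
    assert (claims["billed_amount"] < 1_000_000).all()

def test_row_count_matches_source(claims):
    expected_rows = 3  # in practice, reconciled against the source system's count
    assert len(claims) == expected_rows
```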
Posted 1 month ago