Jobs
Interviews

25,009 ETL Jobs - Page 39

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

0 Lacs

Delhi, India

On-site

With more than 45,000 employees and partners worldwide, the Customer Experience and Success (CE&S) organization is on a mission to empower customers to accelerate business value through differentiated customer experiences that leverage Microsoft’s products and services, ignited by our people and culture. We drive cross-company alignment and execution, ensuring that we consistently exceed customers’ expectations in every interaction, whether in-product, digital, or human-centered. CE&S is responsible for all-up services across the company, including consulting, customer success, and support across Microsoft’s portfolio of solutions and products. Join CE&S and help us accelerate AI transformation for our customers and the world.

The Global Customer Success (GCS) organization, within CE&S, is leading the effort to enable customer success on the Microsoft Cloud by harnessing leading, AI-powered capabilities and human expertise to deliver innovative solutions that accelerate business value, drive operational excellence, and nurture long-term loyalty. Support for Mission Critical (SfMC) is a team within Microsoft that provides solution-specific expertise designed to drive peak health and optimum performance of a customer’s most important solutions. As a key technical resource for the customer, you will focus primarily on delivering proactive services such as education workshops, assessments, and tailored guidance. Troubleshooting skills are essential, as this role includes working with Microsoft Support to expedite incident resolution. This role is flexible in that you can work up to 100% from home.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees, we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities

- Provide architectural reviews and technical guidance to Support for Mission Critical (SfMC) customers, focusing on reliability, security, and performance.
- Take end-to-end ownership and accountability of technical deliverables, ensuring alignment with customer business outcomes and Microsoft’s best practices.
- Identify architectural risks, design gaps, and operational inefficiencies across services.
- Engage with SfMC stakeholders to drive architectural validation, incident prevention, and workload health improvements through proactive engagements and deep technical assessments.
- Collaborate closely with Microsoft engineering and support teams to address escalations, share feedback, and align solutions with platform evolution.
- Drive creation and reusability of IP, including scripts, tools, and technical documentation, to support scalable SfMC engagements.
- Act as a trusted advisor to customer architects and engineers, influencing long-term technical strategy for stability, resilience, and innovation.

Qualifications

- 7+ years of experience in cloud data platforms, with a strong focus on Azure.
- Hands-on experience with Azure Databricks, Azure Machine Learning, Azure Data Factory, and Azure AI services (including Cognitive Services and OpenAI) in secure environments, including data warehousing, ETL pipelines, and real-time data processing.
- Proven expertise in data engineering, data science workflows, and ML model deployment using Azure tools.
- Experience designing and implementing end-to-end AI/ML solutions in enterprise environments.
- Strong understanding of distributed computing, big data processing, and data lake architectures.
- Familiarity with Cosmos DB and SQL Server is helpful.
- Experience with Azure architecture, including IaaS, PaaS, and serverless components.
- Ability to use debugging tools, trace analysis, and source code to troubleshoot and optimize performance.
- Solid understanding of networking, security, and resilience in cloud-native applications.
- Knowledge of Power BI is helpful.
- Strong problem-solving skills and the ability to work collaboratively in cross-functional teams.
- Excellent communication skills in international environments, in both spoken and written English.
- Effective learning and presentation skills, with comfort addressing both small and large audiences.
- Ability to work under pressure and meet deadlines.

Additional Qualifications

- Configure Azure Monitor, Log Analytics workspaces, and diagnostic settings for telemetry ingestion.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
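The additional qualification above, routing telemetry into a Log Analytics workspace via diagnostic settings, comes down to a small ARM payload per monitored resource. Below is a minimal sketch in Python that assembles that payload; the subscription, resource group, and workspace names are placeholders, and the log category names are illustrative assumptions (valid categories vary by resource type).

```python
# Sketch of the request body for a Microsoft.Insights/diagnosticSettings
# resource, which routes platform logs and metrics to a Log Analytics
# workspace. All resource IDs below are placeholders, not real resources.

def diagnostic_settings_body(workspace_id, log_categories, send_metrics=True):
    """Assemble the 'properties' payload for a diagnostic setting."""
    body = {
        "properties": {
            # Destination: the Log Analytics workspace resource ID.
            "workspaceId": workspace_id,
            # One entry per log category to enable.
            "logs": [{"category": c, "enabled": True} for c in log_categories],
        }
    }
    if send_metrics:
        # "AllMetrics" is the standard category group for platform metrics.
        body["properties"]["metrics"] = [
            {"category": "AllMetrics", "enabled": True}
        ]
    return body

workspace = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/resourceGroups/rg-monitoring/providers"
    "/Microsoft.OperationalInsights/workspaces/law-prod"
)
payload = diagnostic_settings_body(workspace, ["AuditLogs", "SignInLogs"])
```

In practice this payload would be sent with `az monitor diagnostic-settings create` or a PUT against the Azure Monitor REST API; consult the docs for the log categories your target resource actually exposes.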

Posted 6 days ago

Apply

7.0 years

0 Lacs

Gujarat, India

On-site


Posted 6 days ago

Apply

7.0 years

0 Lacs

Rajasthan, India

On-site


Posted 6 days ago

Apply

7.0 years

0 Lacs

Uttar Pradesh, India

On-site


Posted 6 days ago

Apply

7.0 years

0 Lacs

Dadra & Nagar Haveli, Daman and Diu, India

On-site


Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: PL/SQL, SQL Writing, mSQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners.

Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations.

We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis.

Responsibilities

- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements

- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have)

- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

Company Description

Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
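The "efficient and optimized SQL queries" and indexing requirements above can be made concrete with a tiny experiment: run the same query before and after creating an index and compare the query plans. This sketch uses SQLite (the listing names MySQL, PostgreSQL, SQL Server, and Oracle, which expose the same idea through their own EXPLAIN output); the table and column names are invented for illustration.

```python
import sqlite3

# Build a throwaway table and compare query plans before/after indexing.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(5000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table
    # or uses an index; the 4th column of each row is the detail text.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT sum(total) FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: a keyed search
```

Reading the plan rather than guessing is the habit the "performance tuning and indexing strategies" bullet is pointing at.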

Posted 6 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ETL Tester
Experience: 4+ yrs (relevant experience: 3 to 6 years)
Location: Pune
Budget: open
Work mode: Hybrid/WFH
Notice period: Immediate (currently not working)

Archimedis helps clients manage operational, technological, and regulatory risk to enhance enterprise value. Our growing practice helps clients manage their enterprise-wide business, technology, and regulatory risks and compliance on a sustained basis. We use proprietary tools, technologies, and accelerators to give clients a more proactive approach to managing risk and achieving regulatory compliance. We design comprehensive compliance programs.

The Team
At Archimedis, we're changing how we develop and deliver our next-generation digital services and technology products. Our capabilities include designing the target operating model, assessing policies and procedures, testing and monitoring on a managed-services and project basis, assessing risk, reporting and communicating, as well as building the analytics and reporting structures to allow for ongoing measurement and monitoring.

ETL Tester Job Description
1. Develop and execute test plans, test cases, and test scripts to validate ETL processes, data transformations, and data migrations in compliance with GxP regulations and industry standards.
2. Perform functional, regression, and integration testing of ETL workflows, ensuring proper data extraction, cleansing, transformation, and loading across various data sources and targets.
3. Collaborate with business analysts, data engineers, and stakeholders to understand project requirements, data mappings, and business rules governing ETL processes.
4. Design and implement automated ETL test suites using testing frameworks and tools, and maintain regression test suites for ongoing validation of ETL pipelines.
5. Conduct data profiling and data quality assessments to identify anomalies, discrepancies, and data integrity issues, and work with data stewards to resolve them.
6. Document and report test results, defects, and validation findings using standardized reporting templates and issue-tracking systems, and communicate findings to project stakeholders.
7. Ensure compliance with regulatory requirements, including GxP, FDA regulations, and other relevant industry standards for data integrity, traceability, and auditability.
8. Participate in validation activities, including validation planning, validation execution, and documentation of validation deliverables in accordance with regulatory guidelines and company SOPs.
9. Stay informed about industry trends, best practices, and emerging technologies in ETL testing and GxP compliance, and apply this knowledge to enhance testing methodologies and processes.
10. Contribute to process improvements, quality initiatives, and knowledge-sharing activities within the testing team and across project teams, fostering a culture of continuous improvement and excellence.
11. AWS and Databricks knowledge is essential.
12. Python programming skill is required to automate ETL validations.

Tech stack background: AWS, Python, SQL, Snowflake, S3

How you will grow
At Archimedis, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning.
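The posting asks for Python skills to automate ETL validations. A minimal sketch of one such check, reconciling a staging table against a warehouse table (sqlite3 stands in here for the Snowflake/S3 stack named above; the table and column names are invented for the example):

```python
import sqlite3

def validate_load(conn, source, target, key):
    """Row-count and key-checksum reconciliation between source and target tables."""
    cur = conn.cursor()
    findings = []
    src_n = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_n = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    if src_n != tgt_n:
        findings.append(f"row count: {source}={src_n}, {target}={tgt_n}")
    # A checksum over the business key catches dropped or duplicated rows
    # even when the counts happen to agree.
    src_sum = cur.execute(f"SELECT TOTAL({key}) FROM {source}").fetchone()[0]
    tgt_sum = cur.execute(f"SELECT TOTAL({key}) FROM {target}").fetchone()[0]
    if src_sum != tgt_sum:
        findings.append(f"{key} checksum: {source}={src_sum}, {target}={tgt_sum}")
    return findings

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")
print(validate_load(conn, "stg_orders", "dw_orders", "order_id"))
```

In practice each finding would be logged to the issue-tracking system mentioned in duty 6 rather than printed.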

Posted 6 days ago

Apply

5.0 years

0 Lacs

India

Remote

Dynamics CRM Developer with Azure Data Factory and Databricks experience, 5+ years of experience, Remote, Pan India, below-30-day joiners. Mandatory experience: Dynamics CRM as the primary skill; Azure Data Factory and Databricks experience as secondary skills.

Design, construct, install, test, and maintain highly scalable data management systems. Implement complex data warehousing projects with a focus on collecting, parsing, managing, analyzing, and visualizing large datasets to turn information into insights using Power BI. Ensure systems meet business requirements and industry practices by integrating new data management technologies and software engineering tools into existing structures. Create robust data pipelines using ETL processes that follow best practices in data modeling, ingestion, cleansing, enrichment, and transformation. Utilize Azure cloud services effectively to deploy and maintain a scalable data infrastructure. Collaborate with data analysts, data scientists, and architects on several projects, ensuring that the optimal data delivery architecture is consistent throughout ongoing projects. Engage with stakeholders and team members to assist with data-related technical issues and support their data infrastructure needs. Develop high-performance algorithms, predictive models, and prototypes using Python and PySpark.
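The cleansing and enrichment steps described above can be sketched in plain Python; in an Azure Data Factory/Databricks project this would typically be PySpark, and the field names and region lookup here are hypothetical:

```python
def clean_contact(record):
    """Cleansing: trim whitespace, normalise email case, drop records with no email."""
    email = (record.get("email") or "").strip().lower()
    if not email:
        return None
    return {**record, "email": email, "name": (record.get("name") or "").strip()}

def enrich_contact(record, region_by_domain):
    """Enrichment: derive a sales region from the email domain (lookup is invented)."""
    domain = record["email"].split("@")[-1]
    return {**record, "region": region_by_domain.get(domain, "unknown")}

def run_pipeline(records, region_by_domain):
    # Cleanse first, then enrich; records failing cleansing are filtered out.
    cleaned = (clean_contact(r) for r in records)
    return [enrich_contact(r, region_by_domain) for r in cleaned if r is not None]

raw = [
    {"name": " Asha Rao ", "email": " Asha@Contoso.COM "},
    {"name": "No Email", "email": ""},
]
print(run_pipeline(raw, {"contoso.com": "APAC"}))
```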

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

The Analytics Lead role within the Enterprise Data team requires an expert Power BI lead with profound data visualization experience and strong proficiency in DAX, SQL, and data modeling techniques. This position offers a unique opportunity to contribute to cutting-edge business analytics using advanced BI tools like cloud-based databases and self-service analytics, aligning with the company's vision of digital transformation.

Responsibilities:
- Lead and oversee a team of Power BI developers, providing guidance and support in their daily tasks.
- Design data visualization models and solutions within the Microsoft Azure ecosystem, including Power BI, Azure Synapse Analytics, MSFT Fabric, and Azure Machine Learning.
- Develop strategies for analytics, reporting, and governance to ensure scalability, reliability, and security.
- Collaborate with business stakeholders to define analytics and reporting strategies.
- Ensure solutions are aligned with organizational objectives, compliance requirements, and technological advancements.
- Act as a subject matter expert in analytics services, mentoring senior/junior Power BI developers.
- Evaluate emerging technologies and analytical capabilities.
- Provide guidance on cost optimization, performance tuning, and best practices in Azure cloud environments.

Stakeholder Collaboration:
- Work closely with business stakeholders, product managers, and data scientists to understand business goals and translate them into technical solutions.
- Collaborate with DevOps, engineering, and operations teams to implement CI/CD pipelines for smooth deployment of analytical solutions.

Governance and Security:
- Define and implement policies for data governance, quality, and security, ensuring compliance with relevant standards such as GDPR and HIPAA.
- Optimize solutions for data privacy, resilience, and disaster recovery.

Qualifications

Required Skills and Experience:
- Proficiency in Power BI and related technologies, including MSFT Fabric, Azure SQL Database, Azure Synapse, Databricks, and other visualization tools.
- Hands-on experience with Power BI, machine learning, and AI services in Azure.
- Strong data visualization skills and experience.
- 12+ years of Power BI development experience, with a track record of designing high-quality models and dashboards.
- 8+ years of experience using Power BI Desktop, DAX, Tabular Editor, and related tools.
- Comprehensive understanding of data modeling, administration, and visualization.
- Excellent leadership and communication skills.
- Relevant certifications in Power BI, machine learning, AI, or enterprise architecture preferred.

Key Competencies:
- Expertise in data visualization tools like Power BI or Tableau.
- Ability to create semantic models for reporting.
- Familiarity with Microsoft Fabric technologies.
- Strong understanding of data governance, compliance, and security frameworks.
- Experience with DevOps and Infrastructure as Code tools.
- Proven ability to drive innovation in data strategy and cloud solutions.
- In-depth knowledge of business intelligence workflows and database design.
- Experience in cloud-based data integration tools and agile development techniques.

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As an ideal candidate, you should possess strong expertise in Core Java, focusing on Data Structures and Algorithms. Your proficiency in Object-Oriented Programming (OOPS) should be solid. Additionally, you should have a robust background in ETL processes and Oracle.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experience: 3 to 6 yrs
Location: Bangalore only
Notice period: 30 days
Mandatory skills: SQL, Power BI (DAX calculations), Power Apps, data modelling, AWS cloud experience, Power Automate

Required Skills
- Bachelor's degree in Computer Science, Business Administration, or a related field.
- Minimum of 3 years of experience in visual reporting development, including hands-on development of analytics dashboards and working with complex data sets.
- Minimum of 3 years of Power BI development experience / SQL Server expertise.
- Excellent Microsoft Office skills, including advanced Excel skills; strong analytical, quantitative, problem-solving, and organizational skills.
- Attention to detail and the ability to coordinate multiple tasks, set priorities, and meet deadlines.

Responsibilities
- Design, develop, and maintain interactive dashboards and reports using Power BI.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data modeling, DAX calculations, and performance tuning within Power BI.
- Integrate data from multiple sources including SQL Server, Excel, and cloud-based platforms.
- Ensure data accuracy, consistency, and security across all reporting solutions.
- Conduct ad-hoc analysis and present findings to business leaders.
- Document processes, data flows, and dashboard logic for transparency and scalability.
- Stay updated with the latest Power BI features and BI best practices.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field.
- 5-6 years of experience in BI and reporting roles.
- Expertise in Power BI, including Power Query, DAX, and Power BI Service.
- Strong SQL skills for data extraction, transformation, and analysis.
- Exposure to data warehousing concepts and ETL processes.
- Familiarity with tools like SSIS, SSRS, or Azure Data Factory is a plus.
Excellent communication skills and the ability to explain technical concepts to non-technical stakeholders. (ref:hirist.tech)

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Documentation Specialist, you will be responsible for creating world-class customer-facing documentation that delights and excites customers. Your role involves removing ambiguity by documenting information effectively, leading to increased team efficiency and effectiveness. Your efforts will help convert tacit knowledge into implicit knowledge. You will manage a full region or multiple customers within a region, owning end-to-end communication and status reporting to both leadership and customers. Your responsibilities include managing your portfolio, estimates, asset projection, unit metrics, tracking CARR (Contracted Annual Recurring Revenue), asset transfers, and cloud costs for fully owned projects. Additionally, you will provide valuable data insights to customers, identify early warning signs for issues, and collaborate with Customer Success stakeholders. Collaborating effectively with stakeholders, managing escalations, planning transitions, and initiating hiring efforts are key aspects of your role. You will also drive initiatives to achieve target profit gross margin and CSAT score for your allocated portfolio, while prioritizing work aspects amidst changing timeframes and incomplete information. Your leadership skills will be crucial in mentoring, grooming, assessing, and providing balanced feedback to your team members. Regular performance discussions and tracking Individual Development Plans are essential. Additionally, you will act as a backup SEM for another region. 
Required Skills:
- Advanced SQL & Unix experience
- Strong ETL & Python support skills
- Hands-on knowledge of analytics tools (Power BI or Tableau)
- Good healthcare domain knowledge
- Fundamental ITIL expertise
- Proficiency in support processes (SLAs, OLAs, product or application support)
- Project and program management abilities
- Escalation & team management skills
- Problem-solving mindset
- Excellent written and verbal communication skills
- Ambitious and adaptable to work in a flexible startup environment with a focus on achieving goals.
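As one illustration of the "ETL & Python support" plus SLA skills listed above, a small sketch of a feed-arrival SLA check (the feed names, cut-off, and grace period are hypothetical):

```python
from datetime import datetime, timedelta

def sla_breaches(arrivals, expected_by, grace=timedelta(minutes=15)):
    """Flag data feeds that arrived after their SLA cut-off plus a grace period.

    arrivals: {feed_name: actual arrival datetime}
    expected_by: {feed_name: cut-off datetime}; feeds absent from
    `arrivals` are reported as missing.
    """
    breaches = []
    for feed, cutoff in expected_by.items():
        arrived = arrivals.get(feed)
        if arrived is None:
            breaches.append((feed, "missing"))
        elif arrived > cutoff + grace:
            breaches.append((feed, f"late by {arrived - cutoff}"))
    return breaches

cutoff = datetime(2024, 1, 1, 6, 0)
arrivals = {"claims": datetime(2024, 1, 1, 6, 10),      # within grace
            "eligibility": datetime(2024, 1, 1, 7, 0)}  # one hour late
print(sla_breaches(arrivals, {"claims": cutoff, "eligibility": cutoff, "providers": cutoff}))
```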

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Telangana

On-site

The job description will be updated once the information is available. In the meantime, please refer to the qualifications section for details on the requirements for this position.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
- Minimum 6 years of experience in data engineering and analytics
- Strong hands-on experience in Oracle Analytics Cloud (OAC) and the OCI Big Data Platform
- Proficiency in Spark, PySpark, Hive, and SQL
- Deep understanding of data integration, ETL pipelines, and data modeling techniques
- Experience working with large-scale data systems and cloud-based architectures
- Familiarity with data security, access control, and compliance best practices
- Strong analytical and problem-solving skills
- Excellent communication and team collaboration abilities

Roles & Responsibilities
- Create robust database structures, schemas, and data models using Oracle Database technologies to support business intelligence and analytics initiatives.
- Develop, schedule, and monitor ETL workflows using Oracle Data Integrator (ODI) or PL/SQL scripts to extract, transform, and load data from various sources.
- Tune Oracle queries, stored procedures, and indexes to enhance the speed and efficiency of data processing and storage.
- Implement validation rules, data governance standards, and security protocols to maintain high-quality, compliant, and secure datasets.
- Work closely with data analysts, architects, and business units to understand requirements and deliver scalable, reliable Oracle data solutions.
- Integrate on-premise Oracle databases with cloud platforms like Oracle Cloud Infrastructure (OCI), AWS, or Azure for hybrid data solutions.
- Participate in Agile ceremonies and work closely with DevOps teams on CI/CD pipeline integration of data workflows.
- Ensure high availability and performance of Oracle databases through tuning and monitoring.
- Utilize tools like Oracle GoldenGate or Oracle Streams to enable real-time data replication and change tracking across systems.
- Design and implement long-term data archiving strategies using Oracle ILM (Information Lifecycle Management).
- Develop solutions for moving data between Oracle and other platforms (e.g., SQL Server, Snowflake, PostgreSQL) via DB links or APIs.

Job Location: Pune (ref:hirist.tech)
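Moving data between Oracle and other platforms, as the last responsibility describes, often reduces to a batched copy loop over two connections. A sketch with sqlite3 standing in for both ends (a real job would use a driver such as python-oracledb on the source side; names are illustrative):

```python
import sqlite3

def copy_table(src_conn, dst_conn, query, insert_sql, batch_size=1000):
    """Stream rows from a source query into a target table in batches,
    committing per batch so a failure never loses more than one batch."""
    cur = src_conn.execute(query)
    copied = 0
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        dst_conn.executemany(insert_sql, batch)
        dst_conn.commit()
        copied += len(batch)
    return copied

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.executescript("CREATE TABLE t (id INTEGER, v TEXT);"
                  "INSERT INTO t VALUES (1,'a'), (2,'b'), (3,'c');")
dst.execute("CREATE TABLE t (id INTEGER, v TEXT)")
n = copy_table(src, dst, "SELECT id, v FROM t", "INSERT INTO t VALUES (?, ?)", batch_size=2)
print(n)
```

The per-batch commit is the design choice worth noting: it bounds both memory use and the re-run cost after a mid-copy failure.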

Posted 1 week ago

Apply

5.0 - 12.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

As a Data Software Engineer, you will apply your 5-12 years of experience in Big Data and data-related technologies to projects in Chennai and Coimbatore in a hybrid work mode. You should possess an expert-level understanding of distributed computing principles and strong knowledge of Apache Spark, with hands-on programming skills in Python. Your role will involve working with technologies such as Hadoop v2, MapReduce, HDFS, Sqoop, Apache Storm, and Spark Streaming to build stream-processing systems. You should have a good grasp of Big Data querying tools like Hive and Impala, as well as experience in integrating data from various sources, including RDBMS, ERP, and files. Experience with NoSQL databases such as HBase, Cassandra, and MongoDB, and knowledge of ETL techniques and frameworks, will be essential for this role. You will be tasked with performance tuning of Spark jobs, working with Azure Databricks, and leading a team efficiently. Additionally, your expertise in designing and implementing Big Data solutions, along with a strong understanding of SQL queries, joins, stored procedures, and relational schemas, will be crucial. As a practitioner of Agile methodology, you will play a key role in the successful delivery of data-driven projects.
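The MapReduce model the role names can be illustrated in pure Python: the same map-shuffle-reduce shape underlies a distributed word count, just without the cluster doing the map calls in parallel:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit (word, 1) pairs; on a cluster each line is processed in parallel."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group values by key, as happens between the map and reduce stages."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["spark streaming with spark", "hive and spark"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)
```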

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position is an intermediate-level role where you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your primary goal will be to contribute to applications systems analysis and programming activities. You should have hands-on experience in ETL and Big Data testing, delivering high-quality solutions. Proficiency in database and UI testing using automation tools is essential. Knowledge of performance, volume, and stress testing is required. You must have a strong understanding of SDLC/STLC processes and different types of manual testing, and be well-versed in Agile methodology. Your responsibilities will include designing and executing test cases, authoring user stories, defect tracking, and aligning with business requirements. You should be open to learning and bringing new innovations to automation processes as per project needs. Managing complex tasks and teams, and fostering a collaborative, growth-oriented environment through strong technical and analytical skills, is a key aspect of this role. You will utilize your knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code. Familiarity with the test management tool JIRA and automation tools such as Python, PySpark, Java, Spark, MySQL, Selenium, and Tosca is required. Experience with Hadoop / Ab Initio is considered a plus. In terms of testing, you will focus on ETL, Big Data, database, and UI. Domain experience in banking and finance is preferred. You will consult with users, clients, and other technology groups on issues, recommend programming solutions, and install and support customer exposure systems.

Qualifications:
- 4-8 years of relevant experience in the financial services industry
- Intermediate-level experience in an Applications Development role
- Clear and concise written and verbal communication skills
- Problem-solving and decision-making abilities
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/university degree or equivalent experience

This job description provides a high-level overview of the work performed. Other job-related duties may be assigned as required.
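ETL test design of the kind described above is usually data-driven: enumerate (input, expected) pairs and run each through the transformation under test. A minimal sketch (the transformation rule itself is hypothetical; in a real suite each case would map to a tracked test case):

```python
def transform(record):
    """Transformation under test (an invented rule): uppercase the currency
    code and round the amount to 2 decimal places."""
    return {"currency": record["currency"].upper(),
            "amount": round(record["amount"], 2)}

# Data-driven cases: (input, expected) pairs covering typical and boundary values.
CASES = [
    ({"currency": "usd", "amount": 10.004}, {"currency": "USD", "amount": 10.0}),
    ({"currency": "inr", "amount": 99.999}, {"currency": "INR", "amount": 100.0}),
]

def run_cases(cases):
    """Return the failing cases; an empty list means the suite passed."""
    failures = []
    for given, expected in cases:
        actual = transform(given)
        if actual != expected:
            failures.append((given, expected, actual))
    return failures

print(run_cases(CASES))
```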

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a Master's degree with a minimum of 6 years of experience, or a Bachelor's degree or foreign equivalent with a minimum of 8 years of experience. Your experience should include working in the full System Development Life Cycle (SDLC) on various technologies and platforms, preferably in Property and Casualty and/or Life Insurance. You must have at least 4-8 years of experience in one or more of the following skills in the P&C/Life Insurance domain:
- Scrum/Agile experience in a Product Owner role or similar in an Agile team environment.
- Excellent understanding of how to operate in an Agile team setting.
- Strong collaboration skills.
- Experience in Agile Poker, WSJF, and similar Agile estimating tools.
- Successful track record of backlog refinement for a complex system implementation.
- Proven ability to quickly learn new complex systems.
- Practical knowledge of correctly assigning story points.
- Demonstrated experience in user story refinement.
- Proven record of minimal re-work after story acceptance.
- Familiarity with good documentation practices.
- Ability to translate business requirements into well-formed technical and system requirements.
- Experience with modeling business processes, data flows, and workflow processes showing relationships between insurance data entities.

Moreover, you should possess:
- Strong business consulting skills related to complex IT system implementations spanning 12-18 months.
- Strong oral and written communication skills, as well as business stakeholder management.
- Experience in business requirement analysis and use case modeling from a business analysis/consulting perspective.
- Experience and willingness to work in a management consulting environment that involves regular travel.
- Database/SQL experience and expertise.
- Knowledge of ETL and data warehouse concepts and processes.
- Exposure to policy and claims data migration.
- Previous experience with tools like MicroStrategy, Tableau, and Qlik.
- A unique combination of technical skills and business acumen, along with a high level of confidence.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Tech Lead - Data Bricks
Job Date: Aug 2, 2025
Job Requisition Id: 59586
Location: Hyderabad, TG, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Databricks professionals in the following areas:

Experience: 8+ years

Job Description
- Overall 8+ years of experience, with a minimum of 3+ years in Azure; should have worked as a lead for at least 3 years
- Should come from a DWH background and have strong ETL experience
- Strong hands-on experience in Azure Databricks/PySpark
- Strong hands-on experience in Azure Data Factory and DevOps
- Strong knowledge of the Big Data stack
- Strong knowledge of Azure Event Hubs, the pub-sub model, and security
- Strong communication and analytical skills
- Highly proficient at SQL development
- Experience working in an Agile environment
- Work as team lead to develop cloud data and analytics solutions
- Mentor junior developers and testers
- Able to build strong relationships with the client technical team
- Participate in the development of cloud data warehouses, data-as-a-service, and business intelligence solutions
- Data wrangling of heterogeneous data
- Coding complex Spark (Scala or Python)

Required Behavioral Competencies
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.

Copyright © 2020. YASH Technologies. All Rights Reserved.
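"Data wrangling of heterogeneous data," as listed above, typically means mapping differently-shaped source records onto one target schema. A small Python sketch (the field names and schemas are invented for illustration; on Databricks this logic would usually live in a PySpark transformation):

```python
def normalise(record):
    """Map heterogeneous source records onto a single target schema.
    Each recognised source shape gets its own mapping branch."""
    if "customerId" in record:      # CRM-style payload
        return {"id": record["customerId"], "name": record["fullName"]}
    if "cust_id" in record:         # warehouse-style row
        return {"id": record["cust_id"], "name": record["cust_name"]}
    raise ValueError(f"unrecognised schema: {sorted(record)}")

mixed = [{"customerId": 7, "fullName": "Asha"},
         {"cust_id": 8, "cust_name": "Ravi"}]
print([normalise(r) for r in mixed])
```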

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow - people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. This position involves developing batch and real-time data pipelines utilizing various data analytics processing frameworks in support of Data Science and Machine Learning practices. You will assist in integrating data from various sources, both internal and external, performing extract, transform, load (ETL) data conversions, and facilitating data cleansing and enrichment. Additionally, you will be involved in full systems life cycle management activities, including analysis, technical requirements, design, coding, testing, and implementation of systems and applications software. The role also entails synthesizing disparate data sources to create reusable and reproducible data assets, as well as assisting the Data Science community in analytical model feature tuning. Responsibilities include contributing to data engineering projects and building solutions by leveraging foundational knowledge in software/application development, programming languages for statistical modeling and analysis, data warehousing, and Cloud solutions. You will collaborate effectively, produce data engineering documentation, gather requirements, organize data, and define project scopes. Data analysis and presentation of findings to stakeholders to support business needs will be part of your tasks. Additionally, you will participate in the integration of data for data engineering projects, understand and utilize analytic reporting tools and technologies, and assist with data engineering maintenance and support. 
Defining data interconnections between operational and business functions, backup and recovery, and utilizing technology solutions for POC analysis are also key responsibilities. Requirements for this role include an understanding of database systems and data warehousing solutions, data life cycle stages, data environment scalability, data security, regulations, and compliance. You should be familiar with analytics reporting technologies, algorithms, data structures, cloud services platforms, ETL tool capabilities, machine learning algorithms, building data APIs, and coding using programming languages for statistical analysis and modeling. Basic knowledge of distributed systems and a Bachelor's degree in MIS, mathematics, statistics, computer science, or equivalent job experience are necessary qualifications. This is a permanent position at UPS, committed to providing a workplace free of discrimination, harassment, and retaliation.
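"Synthesizing disparate data sources to create reusable and reproducible data assets," as described above, often starts with a keyed join of an event feed to reference data. A minimal Python sketch (field names and the unknown-customer flag are illustrative):

```python
def merge_sources(orders, customers):
    """Join order events to customer reference data on customer_id, producing
    one denormalised asset; orders with no matching customer are kept and flagged."""
    by_id = {c["customer_id"]: c for c in customers}
    asset = []
    for order in orders:
        customer = by_id.get(order["customer_id"], {})
        asset.append({**order,
                      "customer_name": customer.get("name", "UNKNOWN"),
                      "segment": customer.get("segment", "UNKNOWN")})
    return asset

orders = [{"order_id": 1, "customer_id": 7, "amount": 120.0},
          {"order_id": 2, "customer_id": 9, "amount": 40.0}]
customers = [{"customer_id": 7, "name": "Asha", "segment": "Enterprise"}]
asset = merge_sources(orders, customers)
print(asset)
```

Keeping rather than dropping unmatched rows is deliberate: the flagged records become the data-quality signal fed back to the source system.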

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Tech Lead - Azure Databricks / Azure Data Factory
Job Date: Aug 2, 2025
Job Requisition Id: 61535
Location: Gurgaon, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Microsoft Fabric professionals in the following areas:

Position: Data Analytics Lead
Experience: 8+ years

Responsibilities:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Experience of working with the manufacturing industry in building a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Manage customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends, and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business to modelling the data and building reports and dashboards.

Required Skills:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountable for the data group's activities, including architecting, developing, and maintaining a centralized data platform comprising operational data, a data warehouse, a data lake, Data Factory pipelines, and data-related services.
- Experience in designing and building operationally efficient pipelines utilising core Azure components such as Azure Data Factory, Azure Databricks, and PySpark.
- Strong understanding of data architecture, data modelling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.
Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools, and methodologies. Able to analyse the impact of a change request/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for the engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs into design and architecture adhering to industry standards/practices; analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of available tools and frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical and machine learning techniques like classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies

Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go, while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Embark on a transformative journey as a Data Test Lead at Barclays, where the vision is clear: to redefine the future of banking and craft innovative solutions. In this role, you will be responsible for creating and enhancing the data that drives the bank's financial transactions, placing data quality at the forefront of all operations. This is a unique opportunity to shape the organization's use of data and be part of an exciting transformation in the banking sector. To excel as a Data Test Lead, you should have experience with a diverse range of solutions, including Fraud Detection, Fraud Servicing & IDV, Application Fraud, and Consumption BI patterns. Strong test automation skills are essential, along with the ability to create frameworks for regression packs. Providing technical guidance and driving the Test Automation team is crucial, emphasizing proactive automation to ensure alignment with the development lifecycle. Collaborating on the DevOps agenda, configuring Jenkins/GitLab pipelines, and maturing automation capabilities through proper documentation are key responsibilities. Additional valued skills for this role include collaborating with development teams to ensure testability and quality throughout the SDLC, identifying opportunities for test optimization, and mentoring junior QA engineers on automation best practices. Effective communication skills, SQL proficiency, working knowledge of Oracle, Hadoop, PySpark, Ab Initio, and other ETL tools, as well as experience with metadata, domain maintenance, and JIRA, are also highly advantageous. The purpose of this role is to design, develop, and execute testing strategies to validate functionality, performance, and user experience, while working closely with cross-functional teams to identify and resolve defects.
The accountabilities include developing and implementing comprehensive test plans, executing automated test scripts, analysing requirements, conducting root cause analysis, and staying informed of industry technology trends. As an Assistant Vice President, you are expected to advise and influence decision-making, contribute to policy development, and lead a team performing complex tasks with professionalism and expertise. People Leaders are also expected to demonstrate leadership behaviours that create an environment for colleagues to excel. Colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their daily interactions and work.
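The kind of proactive, automated check such a regression pack bundles for a CI stage (a Jenkins or GitLab pipeline job, as mentioned above) can be sketched as follows. This is an illustrative Python example against a hypothetical schema, not anything Barclays-specific, using sqlite3 from the standard library.

```python
import sqlite3

def check_no_orphan_transactions(conn: sqlite3.Connection) -> list:
    """Regression check: every transaction must reference a known account.
    Returns the offending rows (an empty list means pass), so a CI stage
    can assert on the result and fail fast on data-quality drift."""
    return conn.execute(
        """
        SELECT t.txn_id
        FROM transactions t
        LEFT JOIN accounts a ON a.account_id = t.account_id
        WHERE a.account_id IS NULL
        """
    ).fetchall()

# Illustrative fixture data (hypothetical schema for the demo).
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE accounts (account_id INTEGER PRIMARY KEY);
    CREATE TABLE transactions (txn_id INTEGER, account_id INTEGER);
    INSERT INTO accounts VALUES (1), (2);
    INSERT INTO transactions VALUES (10, 1), (11, 2), (12, 99);
    """
)
orphans = check_no_orphan_transactions(conn)
print(orphans)  # [(12,)] -- transaction 12 points at an unknown account
```

Checks in this shape (referential integrity, row counts, null rates) are cheap to run after every load, which is what makes automating them ahead of the development lifecycle practical.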

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You have over 8 years of experience and are located in Balewadi, Pune. You possess a strong understanding of data architecture and have led data-driven projects. Your expertise includes knowledge of data modelling paradigms such as Kimball, Inmon, Data Marts, Data Vault, and Medallion. Experience with cloud-based data strategies, particularly on AWS, is preferred. Designing data pipelines for ETL with expert knowledge of ingestion, transformation, and data quality is a must, along with hands-on experience in SQL. An in-depth understanding of PostgreSQL development, query optimization, and index design is a key requirement. Proficiency in PostgreSQL's PL/pgSQL for complex warehouse workflows is necessary. You should be able to write intermediate to complex SQL, use advanced SQL concepts such as RANK and DENSE_RANK, and apply advanced statistical concepts through SQL. Working experience with PostgreSQL extensions such as PostGIS is desired. Expertise in writing ETL pipelines combining Python and SQL is required, as is an understanding of Python data manipulation libraries such as Pandas, Polars, and DuckDB. Experience in designing data visualizations with tools such as Tableau and Power BI is desirable. Your responsibilities include participating in designing and developing features in the existing data warehouse, and providing leadership in establishing connections between the engineering, product, and analytics/data science teams. You will design, implement, and update existing and new batch ETL pipelines, define and implement data architecture, and work with data orchestration tools such as Apache Airflow, Dagster, and Prefect. Collaboration with engineers and data analysts to build reliable datasets that can be trusted and used by the company is essential. You should be comfortable in a fast-paced start-up environment, passionate about your job, and enjoy a dynamic international working environment.
A background in the telecom industry is a plus, though not mandatory. You should have a penchant for automating tasks and enjoy monitoring processes.
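The advanced SQL concepts the posting calls out, RANK and DENSE_RANK, differ only in how they number rows after a tie: RANK leaves gaps, DENSE_RANK does not. A minimal, runnable illustration using Python's bundled sqlite3 (window functions require SQLite 3.25+); the sales table is invented for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', 300), ('south', 300), ('east', 200), ('west', 100);
    """
)
rows = conn.execute(
    """
    SELECT region,
           RANK()       OVER (ORDER BY amount DESC) AS rnk,
           DENSE_RANK() OVER (ORDER BY amount DESC) AS dense_rnk
    FROM sales
    ORDER BY amount DESC, region
    """
).fetchall()
for region, rnk, dense in rows:
    print(region, rnk, dense)
# north and south tie at rank 1; east is RANK 3 (gap) but DENSE_RANK 2
```

The same query runs unchanged on PostgreSQL, where PARTITION BY clauses and statistical aggregates extend the pattern to per-group rankings.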

Posted 1 week ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This job is with Amazon, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Description
Want to participate in building the next generation of an online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in the MENA region, with operations spanning 8 countries, offering online payment services to thousands of merchants. The APS team is building a robust payment solution to drive the best payment experience on and off Amazon. Over 100 million customers send tens of billions of dollars through our systems annually. We build systems that process payments at an unprecedented scale with accuracy, speed, and mission-critical availability. We innovate to improve the customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements, and gift cards. Many new, exciting, and challenging ideas are in the works.

Key job responsibilities
Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will collaborate closely with both technical and non-technical teams to design and execute roadmaps.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

noida, uttar pradesh

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We're looking for candidates with Syniti and other programming skills to join the EY GDS SAP BI & Data team. This is a fantastic opportunity to be part of a leading firm and be instrumental in its growth.

Your key responsibilities include:
- Providing expert-level business analysis on SAP modules FI, CO, MM, SD, PM, PP, and PS
- Implementing and developing customer deliverables that meet or exceed customer requirements
- Developing and demonstrating a good understanding of business processes for the assigned functional area/data objects
- Demonstrating strong knowledge of the underlying technical data structures and definitions for the assigned functional process area/data objects
- Contributing to an integrated data solution through data analysis, reporting, and collaboration with on-site colleagues and clients
- Applying expertise in SAP BW 7.5 and SAP BW on HANA/BW/4HANA
- Working closely with other consultants on customer sites as part of small to large project teams
- Conducting requirements analysis and data analysis, and creating reports
- Maintaining responsibility for the completion and accuracy of deliverables
- Actively expanding consulting skills and professional development through training courses, mentoring, and daily interaction with clients

Skills and attributes for success:
- Hands-on experience with SAP BW 7.5 and HANA implementation and support
- Understanding of standard and custom SAP BW extractor functionality, with ABAP debugging skills
- Prior experience supporting ETL and incident management/bug fixes
- Hands-on experience understanding and applying transformations using ABAP and AMDP, advanced DSOs, and Composite Providers using LSA++, plus performance optimization concepts
- Prior experience with traditional non-HANA BW data modeling: MultiCubes, ODS objects, InfoCubes, transfer rules, start routines, end routines, InfoSet queries, InfoObjects, and user exits
- Hands-on experience with SAP HANA data modeling views (Attribute, Analytic, and Calculation views)
- Proficiency in developing and understanding SAP Analysis for Microsoft Office to perform custom calculations, filtering, and sorts that support complex business planning and reporting scenarios
- Hands-on experience with the collection of transport requests through the landscape
- Experience in performance tuning, troubleshooting, and monthly release activities as necessary
- Knowledge of SAP ECC business processes and functional aspects in Sales, Billing, Finance, Controlling, and Project Systems

To qualify for the role, you must have:
- A minimum of 7 years of SAP Analytics/Business Intelligence/Business Warehouse (BI/BW/HANA) experience with a professional services advisory firm or publicly traded company, including experience leading and delivering full lifecycle implementations
- At least one end-to-end implementation with SAP HANA 1.0 or 2.0, including at least one full lifecycle project implementation with SAP HANA SQL and/or SAP S/4HANA

Ideally, you should also have:
- A bachelor's degree from an accredited college/university
- Hands-on experience with SAP HANA modeling: table creation (row store, column store), ABAP procedures, data modeling, modeling views (Calculation and Attribute views), decision tables, and analytic privileges (an added advantage)
- Knowledge of roles and authorizations

What we look for:
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment, with consulting skills
- An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Title: PySpark Data Engineer
Experience: 5–8 Years
Location: Hyderabad
Employment Type: Full-Time

Job Summary
We are looking for a skilled and experienced PySpark Data Engineer to join our growing data engineering team. The ideal candidate will have 5–8 years of experience in designing and implementing data pipelines using PySpark, AWS Glue, and Apache Airflow, with strong proficiency in SQL. You will be responsible for building scalable data processing solutions, optimizing data workflows, and collaborating with cross-functional teams to deliver high-quality data assets.

Key Responsibilities
- Design, develop, and maintain large-scale ETL pipelines using PySpark and AWS Glue.
- Orchestrate and schedule data workflows using Apache Airflow.
- Optimize data processing jobs for performance and cost-efficiency.
- Work with large datasets from various sources, ensuring data quality and consistency.
- Collaborate with data scientists, analysts, and other engineers to understand data requirements and deliver solutions.
- Write efficient, reusable, and well-documented code following best practices.
- Monitor data pipeline health and performance; resolve data-related issues proactively.
- Participate in code reviews, architecture discussions, and performance tuning.

Requirements
- 5–8 years of experience in data engineering roles.
- Strong expertise in PySpark for distributed data processing.
- Hands-on experience with AWS Glue and other AWS data services (S3, Athena, Lambda, etc.).
- Experience with Apache Airflow for workflow orchestration.
- Strong proficiency in SQL for data extraction, transformation, and analysis.
- Familiarity with data modeling concepts and data lake/data warehouse architectures.
- Experience with version control systems (e.g., Git) and CI/CD processes.
- Ability to write clean, scalable, and production-grade code.

Benefits: Company standard benefits.
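Workflow orchestration with a tool like Apache Airflow comes down to one rule: a task runs only after all of its upstream tasks have finished. A minimal sketch of that ordering rule using only the standard library's graphlib (Python 3.9+), deliberately not the Airflow API; the task names and edges are invented for illustration.

```python
from graphlib import TopologicalSorter

def run_pipeline(dag: dict, tasks: dict) -> list:
    """Execute callables in an order that respects upstream dependencies,
    the way an orchestrator's scheduler sequences a DAG run."""
    executed = []
    for name in TopologicalSorter(dag).static_order():
        tasks[name]()          # in Airflow this would be an operator's execute()
        executed.append(name)
    return executed

log = []
tasks = {
    "extract":   lambda: log.append("pulled raw files"),
    "transform": lambda: log.append("cleaned with PySpark-style logic"),
    "validate":  lambda: log.append("row counts checked"),
    "load":      lambda: log.append("written to the lake"),
}
# Edges read as {task: set_of_upstream_tasks}, mirroring Airflow's
# "transform depends on extract" style of wiring.
dag = {
    "transform": {"extract"},
    "validate":  {"transform"},
    "load":      {"validate"},
}
order = run_pipeline(dag, tasks)
print(order)  # ['extract', 'transform', 'validate', 'load']
```

Real orchestrators layer scheduling, retries, and backfills on top, but the dependency-ordering core is exactly this topological sort.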

Posted 1 week ago

Apply