
25,068 ETL Jobs - Page 43

JobPe aggregates listings for easy access, but you apply on the original job portal directly.

0 years

0 Lacs

Greater Bengaluru Area

On-site

Area(s) of Responsibility: ETL Testing

The skills that are key to this role (technical):
- Develop and execute test plans, test cases, and test scripts for ETL processes.
- Expertise in validating data extraction, transformation, and loading workflows.
- Write PL/SQL queries and procedures, manage databases, validate data transformations, and ensure data integrity.
- Identify and report data quality issues and inconsistencies.
- Collaborate with ETL developers and data engineers to resolve data quality issues.
- Analyze test results and provide detailed reports to stakeholders.
- Automate repetitive testing tasks to improve efficiency (a reconciliation sketch follows this listing).
- Ensure compliance with industry standards and best practices in data quality assurance.
- Experience with tools like Informatica, Control-M, and DataStage for automating data extraction and transformation processes.
- Understanding of data warehousing architectures and schemas to ensure effective data integration.
- Experience building, maintaining, and optimizing automated test cases.
- Good to have: experience with Selenium, Cucumber, Java, Shell, and Groovy scripting.
- Experience with automated application build, deployment, and support using Maven and Ant.
- Experience with version control and continuous integration of build, deploy, and test using Jenkins and Stash.
- Design innovative technical solutions using automation practices.
- Experience in framework development and maintenance.
- Experience working with AWS is a big plus!
- Experience as a developer (e.g., Java, Spring) is a plus.
- Communicate effectively within the team as well as with partners.
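The automation duties above boil down to proving that data survives extraction, transformation, and loading intact. As a rough illustration of the kind of check such test scripts automate, the sketch below reconciles row counts and a crude content checksum between a staging table and its target; SQLite stands in for the real warehouse, and all table and column names are hypothetical.

```python
# Minimal sketch of an automated ETL reconciliation check: compare row counts
# and a per-column checksum between a source (staging) table and its target.
# Table and column names here are hypothetical placeholders.
import sqlite3

def reconcile(conn, source: str, target: str, key_cols: list[str]) -> dict:
    """Return row-count and checksum comparisons for two tables."""
    cur = conn.cursor()
    results = {}
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        results[f"{table}_rows"] = cur.fetchone()[0]
    # Cheap content check: aggregate over the concatenated key columns.
    expr = " || '|' || ".join(f"CAST({c} AS TEXT)" for c in key_cols)
    for table in (source, target):
        cur.execute(f"SELECT TOTAL(LENGTH({expr})) FROM {table}")
        results[f"{table}_checksum"] = cur.fetchone()[0]
    results["match"] = (
        results[f"{source}_rows"] == results[f"{target}_rows"]
        and results[f"{source}_checksum"] == results[f"{target}_checksum"]
    )
    return results

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.5), (2, 20.0);
        INSERT INTO dw_orders  VALUES (1, 10.5), (2, 20.0);
    """)
    print(reconcile(conn, "stg_orders", "dw_orders", ["id", "amount"]))
```

A real test suite would wrap checks like this in pytest and extend the checksum to proper column-level hashing, but the row-count-plus-content pattern is the core of most source-to-target validation.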

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Dear Aspirant! We empower our people to stay resilient and relevant in a constantly changing world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant international team.

We are looking for: Senior Software Developer – Python

You’ll make an impact by:
- Writing high-quality, testable, and maintainable Python code using object-oriented programming (OOP), SOLID principles, and design patterns.
- Developing RESTful APIs and backend services for AI/ML model serving using FastAPI (a serving sketch follows this listing).
- Collaborating with AI/ML engineers to integrate and deploy Machine Learning, Deep Learning, and Generative AI models into production environments.
- Contributing to software architecture and design discussions to ensure scalable and efficient solutions.
- Implementing CI/CD pipelines and adhering to DevOps best practices for reliable and repeatable deployments.
- Designing for observability, incorporating structured logging, performance monitoring, and alerting mechanisms.
- Optimizing code and system performance, ensuring reliability and robustness at scale.
- Participating in code reviews, promoting clean code practices, and mentoring junior developers when needed.

Use your skills to move the world forward! You bring:
- Bachelor’s or Master’s degree in Computer Science, IT, or a related field.
- 5+ years of hands-on experience in software development, with a focus on Python.
- Deep understanding of OOP concepts, software architecture, and design patterns.
- Experience with backend web frameworks, preferably FastAPI.
- Familiarity with integrating ML/DL models into software solutions.
- Practical experience with CI/CD, containerization (Docker), and version control systems (Git).
- Exposure to MLOps practices and tools for model deployment and monitoring.
- Strong collaboration and communication skills in cross-functional engineering teams.
- Familiarity with cloud platforms like AWS (e.g., SageMaker, Bedrock) or Azure (e.g., ML Studio, OpenAI Service).
- Experience in Rust is a strong plus.
- Experience working on high-performance, scalable backend systems.
- Exposure to logging/monitoring stacks like Prometheus, Grafana, ELK, or OpenTelemetry.
- Understanding of data engineering concepts, ETL pipelines, and processing large datasets.
- Background or interest in the Power and Energy domain is a plus.

Create a better #TomorrowWithUs! This role is based in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries - and the shape of things to come. We’re Siemens: a collection of over 312,000 minds building the future, one day at a time, in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at: www.siemens.com/careers Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds
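Since the role centers on serving ML models behind FastAPI, here is a minimal sketch of what such an endpoint can look like. The model, feature names, and route are hypothetical stand-ins, not Siemens code; a real service would load a trained model instead of the stub function.

```python
# Minimal sketch of a FastAPI endpoint serving a (stub) ML model.
# Feature names and the scoring logic are hypothetical placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-serving-sketch")

class Features(BaseModel):
    voltage: float
    current: float

class Prediction(BaseModel):
    score: float

def predict_stub(f: Features) -> float:
    # Stand-in for a real model's predict(); swap in a loaded model here.
    return 0.5 * f.voltage + 0.1 * f.current

@app.post("/predict", response_model=Prediction)
def predict(features: Features) -> Prediction:
    return Prediction(score=predict_stub(features))

# Run with: uvicorn main:app --reload
```

Pydantic models give the request/response validation the posting's "testable, maintainable" emphasis implies, and the stub isolates model loading so the endpoint stays unit-testable.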

Posted 1 week ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Job Responsibilities: VOC Expert - VI (Vulnerability Intelligence), ASM (Attack Surface Management) & VM (Vulnerability Management)

Environment / Context: Saint-Gobain, world leader in the habitat and construction market, is one of the top 100 global industrial groups. Saint-Gobain is present in 68 countries with 171,000 employees. They design, manufacture and distribute materials and solutions which are key ingredients in the wellbeing of each of us and the future of all. They can be found everywhere in our living places and our daily life: in buildings, transportation, infrastructure and in many industrial applications. They provide comfort, performance and safety while addressing the challenges of sustainable construction, resource efficiency and climate change.

Saint-Gobain GDI Group (250 persons at the head office, including 120 that are internal) is responsible for defining, setting up and managing the Group's Information Systems (IS) and Telecom policy with its 1,000 subsidiaries in 6,500 sites worldwide. GDI Group also carries the common means (infrastructures, telecoms, digital platforms, cross-functional applications). INDEC, the IT Development Centre of Saint-Gobain, is an entity with a vision to leverage India’s technical skills in the Information Technology domain to provide timely, high-quality and cost-effective IT solutions to Saint-Gobain businesses globally. Within the Cybersecurity Department, the Cybersecurity Vulnerability Operations Center's mission is to identify, assess and confirm vulnerabilities and threats that can affect the Group. The CyberVOC teams are based out of Paris and Mumbai and consist of skilled persons working in different Service Lines.

Mission: We are seeking a highly experienced cybersecurity professional to serve as a VOC Expert supporting the Vulnerability Intelligence (VI), Attack Surface Management (ASM), and Vulnerability Management (VM) teams. This role is pivotal in shaping the strategy, defining technical approaches, and supporting day-to-day operations, particularly complex escalations and automation efforts. The ideal candidate will combine technical mastery in offensive security with practical experience in vulnerability lifecycle management and external attack surface discovery. The expert will act as a senior advisor and technical authority for the analyst teams, while also contributing to the design, scripting, and documentation of scalable security processes.

The VOC Expert is responsible for:

Vulnerability Intelligence (VI):
- Drive the qualification and risk analysis of newly disclosed vulnerabilities.
- Perform exploit PoC validation when needed to assess practical risk.
- Maintain and enhance the central VI database, enriched with EPSS, CVSS, QVS, SG-specific scoring models, and EUVD (an enrichment sketch follows this listing).
- Define and automate workflows for vulnerability qualification, exposure analysis, and prioritization, and for ingestion of qualified vulnerability data into the enterprise Data Lake.
- Collaborate on documentation of VI methodology and threat intelligence integration.
- Support proactive communication of high/critical vulnerabilities to asset and application owners.

Attack Surface Management (ASM):
- Operate and enhance external asset discovery and continuous monitoring using ASM tools.
- Integrate asset coverage data from the CMDB and other internal datasets.
- Design and implement scripts for WHOIS/ASN/banner correlation, data enrichment, and alert filtering.
- Deploy and maintain custom scanning capabilities (e.g., Nuclei integrations).
- Provide expert input on threat modeling based on exposed assets and external footprint.

BlackBox Pentesting:
- Maintain the service delivery of the BlackBox Pentesting platform.
- Automate the export of pentest data and integrate it into the Data Lake and Power BI dashboards.
- Define and document onboarding workflows for new applications.
- Actively guide analysts in prioritizing pentest requests and validating results.

Vulnerability Management (VM):
- Vulnerability review, recategorization, and false positive identification.
- Proactive vulnerability testing and replay.
- Pre-analyze and consolidate vulnerability data from various scanning tools.
- Prepare concise syntheses of available vulnerabilities.
- Offer guidance to the SO and CISO on vulnerabilities.
- Collaborate with key stakeholders to develop strategies for vulnerability management.
- Assist in defining vulnerability management KPIs and strategic goals.
- Prepare concise, actionable summaries for high-risk vulnerabilities and trends.

Automate testing actions:
- Develop scripts and tooling to automate repetitive and complex tasks across VI, ASM and VM.
- Implement data pipelines to sync outputs from ASM/VI tools to dashboards and reporting engines.
- Design streamlined workflows for the vulnerability lifecycle, from detection to closure.
- Collaborate with both offensive and defensive teams to support App managers and Asset managers in remediating vulnerabilities and issues.

Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Security, EXTC or a related field; relevant certifications (e.g., CISSP, CCSP, CompTIA Security+) are a plus.
- Proven experience (10+ years) in the cybersecurity field, with a focus on offensive security, vulnerability intelligence and attack surface analysis.
- Proven experience in penetration testing (web application, infrastructure, …).
- Proven expertise in CVE analysis, exploit development/validation, external asset discovery and mapping, and threat modeling and prioritization.
- Advanced knowledge of tooling such as ASM platforms, Nuclei, Shodan, open-source CTI, vulnerability scanners (Qualys, Tenable, …), and pentester tools (Burp, SQLmap, Responder, IDA and the Kali environment).
- Experience investigating newly published vulnerabilities and assessing their risks and severity.
- Strong scripting skills (e.g., Python, Bash, PowerShell, C#, …) for automation and customization.
- Strong technical skills with an interest in open-source intelligence investigations.
- Experience building dashboards in Power BI or similar tools.
- Familiarity with data lakes, API integrations, and ETL processes.
- Knowledge of the NIST CVE database, the OWASP Top 10, and Microsoft security bulletins.
- Excellent writing skills in English and the ability to communicate complicated technical challenges in business language to a range of stakeholders.

Personal Skills:
- A systematic, disciplined, and analytical approach to problem solving, with thorough leadership skills and experience.
- Excellent ability to think critically under pressure.
- Strong communication skills to convey technical concepts clearly to both technical and non-technical stakeholders.
- Willingness to stay updated with evolving cyber threats, technologies, and industry trends.
- Capacity to work collaboratively with cross-functional teams, developers, and management to implement robust security measures.

Additional Information: The position is based in Mumbai (India).
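As one concrete illustration of the VI enrichment work described above, the sketch below pulls EPSS scores for a batch of CVEs from FIRST's public EPSS API and flags the high-probability ones. The threshold and response-field handling are assumptions to verify against the live API documentation; this is not Saint-Gobain's tooling.

```python
# Minimal sketch of vulnerability-intelligence enrichment: fetch EPSS scores
# for a batch of CVEs and rank them. Endpoint and field names follow the
# public docs at api.first.org; verify against the live API before relying
# on them. The 0.5 flag threshold is an arbitrary illustration.
import requests

EPSS_URL = "https://api.first.org/data/v1/epss"

def epss_scores(cves: list[str]) -> dict[str, float]:
    resp = requests.get(EPSS_URL, params={"cve": ",".join(cves)}, timeout=30)
    resp.raise_for_status()
    return {row["cve"]: float(row["epss"]) for row in resp.json().get("data", [])}

if __name__ == "__main__":
    scores = epss_scores(["CVE-2021-44228", "CVE-2019-0708"])
    for cve, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        flag = "HIGH" if score >= 0.5 else "normal"
        print(f"{cve}: epss={score:.3f} [{flag}]")
```

In a real pipeline this feed would be joined with CVSS and the organisation-specific scores mentioned above before ingestion into the Data Lake.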

Posted 1 week ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description: We are seeking a highly skilled and experienced Big Data Architect cum Data Engineer to join our dynamic team. The ideal candidate should have a strong background in big data technologies and solution design, with hands-on expertise in PySpark, Databricks, and SQL. This position requires significant experience in building and managing data solutions in Databricks on Azure. The candidate should also have strong communication skills and experience in managing mid-size teams, handling client conversations, and presenting points of view and thought leadership.

Responsibilities:
· Design and implement scalable big data architectures and solutions utilizing PySpark, SparkSQL, and Databricks on Azure or AWS.
· Build robust data models and maintain metadata-driven frameworks to optimize data processing and analytics (a metadata-driven sketch follows this listing).
· Build, test, and deploy sophisticated ETL pipelines using Azure Data Factory and other Azure-based tools.
· Ensure seamless data flow from various sources to destinations including ADLS Gen 2.
· Implement data quality checks and validation frameworks.
· Establish and enforce data governance principles, ensuring data security and compliance with industry standards and regulations.
· Manage version control and deployment pipelines using Git and DevOps best practices.
· Provide accurate effort estimation and manage project timelines effectively.
· Collaborate with cross-functional teams to ensure aligned project goals and objectives.
· Leverage industry knowledge in banking, insurance, and pharma to design tailor-made data solutions.
· Stay updated with industry trends and innovations to proactively implement cutting-edge technologies and methodologies.
· Facilitate discussions between technical and non-technical stakeholders to drive project success.
· Document technical solutions and design specifications clearly and concisely.

Qualifications:
· Bachelor's degree in Computer Science, Engineering, or a related field; Master’s degree preferred.
· 8+ years of experience in big data architecture and engineering.
· Extensive experience with PySpark, SparkSQL, and Databricks on Azure.
· Proficient in using Azure Data Lake Storage Gen 2, Azure Data Factory, Azure Event Hub, and Synapse.
· Strong experience in data modeling, metadata frameworks, and effort estimation.
· Experience of DevSecOps practices with proficiency in Git.
· Demonstrated experience in implementing data quality, data security, and data governance measures.
· Industry experience in banking, insurance, or pharma is a significant plus.
· Excellent communication skills, capable of articulating complex technical concepts to diverse audiences.
· Certification in Azure, Databricks or related cloud technologies is a must.
· Familiarity with machine learning frameworks and data science methodologies is preferred.

Mandatory skill sets: Data Architect/Data Engineer/AWS
Preferred skill sets: Data Architect/Data Engineer/AWS
Years of experience required: 8-12 years
Education qualification: B.E. (B.Tech)/M.E./M.Tech
Degrees/Field of Study required: Master Degree, Bachelor Degree
Required Skills: Data Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}
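The responsibilities above mention metadata-driven frameworks on PySpark/Databricks. The sketch below shows the general shape of one, with assumed paths, table names, and metadata layout; it is an illustration of the pattern, not PwC's framework.

```python
# Minimal sketch of a metadata-driven PySpark ingestion step: pipeline
# behavior is declared as data, and one generic function executes it.
# All paths, formats, and table names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven-etl").getOrCreate()

# In a real framework this table would live in the lake or metastore.
PIPELINE_METADATA = [
    {"source": "/raw/orders", "format": "json", "target": "curated.orders",
     "drop_nulls": ["order_id"]},
    {"source": "/raw/customers", "format": "parquet",
     "target": "curated.customers", "drop_nulls": ["customer_id"]},
]

def run_step(meta: dict) -> None:
    df = spark.read.format(meta["format"]).load(meta["source"])
    df = df.dropna(subset=meta["drop_nulls"])  # basic quality gate
    df.write.mode("overwrite").saveAsTable(meta["target"])

for step in PIPELINE_METADATA:
    run_step(step)
```

The payoff of the pattern is that onboarding a new source becomes a metadata row rather than new pipeline code.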

Posted 1 week ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description: We are seeking a highly skilled and experienced Big Data Architect cum Data Engineer to join our dynamic team. The ideal candidate should have a strong background in big data technologies and solution design, with hands-on expertise in PySpark, Databricks, and SQL. This position requires significant experience in building and managing data solutions in Databricks on Azure. The candidate should also have strong communication skills and experience in managing mid-size teams, handling client conversations, and presenting points of view and thought leadership.

Responsibilities:
· Design and implement scalable big data architectures and solutions utilizing PySpark, SparkSQL, and Databricks on Azure or AWS.
· Build robust data models and maintain metadata-driven frameworks to optimize data processing and analytics.
· Build, test, and deploy sophisticated ETL pipelines using Azure Data Factory and other Azure-based tools.
· Ensure seamless data flow from various sources to destinations including ADLS Gen 2.
· Implement data quality checks and validation frameworks (a quality-rules sketch follows this listing).
· Establish and enforce data governance principles, ensuring data security and compliance with industry standards and regulations.
· Manage version control and deployment pipelines using Git and DevOps best practices.
· Provide accurate effort estimation and manage project timelines effectively.
· Collaborate with cross-functional teams to ensure aligned project goals and objectives.
· Leverage industry knowledge in banking, insurance, and pharma to design tailor-made data solutions.
· Stay updated with industry trends and innovations to proactively implement cutting-edge technologies and methodologies.
· Facilitate discussions between technical and non-technical stakeholders to drive project success.
· Document technical solutions and design specifications clearly and concisely.

Qualifications:
· Bachelor's degree in Computer Science, Engineering, or a related field; Master’s degree preferred.
· 8+ years of experience in big data architecture and engineering.
· Extensive experience with PySpark, SparkSQL, and Databricks on Azure.
· Proficient in using Azure Data Lake Storage Gen 2, Azure Data Factory, Azure Event Hub, and Synapse.
· Strong experience in data modeling, metadata frameworks, and effort estimation.
· Experience of DevSecOps practices with proficiency in Git.
· Demonstrated experience in implementing data quality, data security, and data governance measures.
· Industry experience in banking, insurance, or pharma is a significant plus.
· Excellent communication skills, capable of articulating complex technical concepts to diverse audiences.
· Certification in Azure, Databricks or related cloud technologies is a must.
· Familiarity with machine learning frameworks and data science methodologies is preferred.

Mandatory skill sets: Data Architect/Data Engineer/AWS
Preferred skill sets: Data Architect/Data Engineer/AWS
Years of experience required: 8-12 years
Education qualification: B.E. (B.Tech)/M.E./M.Tech
Degrees/Field of Study required: Bachelor Degree, Master Degree
Required Skills: Data Architecture
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more}
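This Manager-level posting repeats the data quality checks and validation frameworks requirement. Complementing the metadata-driven sketch under the previous listing, here is a minimal, hypothetical example of declarative quality rules evaluated with PySpark; rule names, columns, and data are placeholders.

```python
# Minimal sketch of declarative data-quality rules in PySpark: each rule is a
# named boolean Column expression evaluated against the same DataFrame.
# Columns, rows, and rule choices are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.createDataFrame(
    [(1, "IN", 100.0), (2, "IN", None), (3, None, 50.0)],
    ["id", "country", "amount"],
)

RULES = {
    "id_not_null": F.col("id").isNotNull(),
    "country_not_null": F.col("country").isNotNull(),
    "amount_positive": F.col("amount") > 0,  # NULL amounts count as failures
}

total = df.count()
for name, condition in RULES.items():
    passed = df.filter(condition).count()
    print(f"{name}: {passed}/{total} rows pass")
```

A production framework would persist these results and gate pipeline promotion on them, but the rule-as-expression structure is the core idea.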

Posted 1 week ago

Apply

15.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward and progress: to make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things.

Job Title: Principal Data Engineer
Location: Manyata Tech Park, Bangalore
Business & Team: CTO - Engineering Platform

Impact & contribution: Here, you will create, manage and optimize the platforms used to power our entire business. You will have the opportunity to take responsibility for continuous improvement, ensuring our systems are simpler, faster and more secure. The cloud movement at CommBank is going strong and continues to grow. We are looking for out-of-the-box thinkers who want to use technology to work on real-world problems that have the potential to change the lives of our 17 million+ customers. We support our people with the flexibility to balance where work is done, with at least half your time each month connecting in office. We also have many other flexible working options available, including changing start and finish times, part-time arrangements and job share, to name a few. Talk to us about how these arrangements might work for you. As a Principal Engineer you will be responsible for various aspects such as a Data Engineering mindset, a DevSecOps mindset, and an Engineering Quality and Productivity mindset. You will also contribute to addressing high- and medium-rated issues and planned work addressing underlying actions.

Roles & responsibilities:
- Lead the design and architecture of scalable, maintainable, and secure full-stack applications.
- Collaborate with product managers, designers, and other stakeholders to translate business requirements into technical solutions.
- Ensure adherence to best practices in software design, including modularity, reusability, and maintainability.
- Utilise AWS services such as S3, Lambda, Glue, Step Functions, and CloudWatch to build and manage data solutions (a Lambda-to-Glue sketch follows this listing).
- Implement best practices for AWS security, cost management, and performance optimisation.
- Stay updated with the latest AWS offerings and evaluate their potential impact on the data infrastructure.
- Lead the design and implementation of data management solutions, ensuring data integrity, quality, and accessibility.
- Develop and maintain data models, data pipelines, and ETL processes.
- Implement data governance and compliance measures to adhere to regulatory requirements.

Essential skills:
- Minimum 15+ years of experience.
- Demonstrated expertise in dashboard and backend development, with a strong focus on SQL.
- Proficient in JavaScript, Python, SQL, and dbt.
- Extensive experience in building data lakes and data warehouses.
- Mastery of both SQL and NoSQL databases, data ingestion processes, ETL pipelines, and data integration.
- Ability to architect and implement real-time data streaming solutions using technologies such as Apache Kafka, AWS Kinesis, or similar, ensuring high availability, scalability, and low latency of data streaming pipelines.
- Collaboration with data scientists and analysts to enable real-time data analytics and insights.
- Proficient in data profiling techniques and tools.

Education Qualification: Bachelor’s or Master’s degree in Engineering in Computer Science/Information Technology.

If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career. We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696.

Advertising End Date: 31/08/2025
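The AWS services named above (S3, Lambda, Glue) are commonly wired together so that new raw objects landing in a bucket trigger a curation job. The sketch below shows that pattern with boto3; the Glue job name and argument keys are hypothetical placeholders, not CommBank's setup.

```python
# Minimal sketch of the S3 -> Lambda -> Glue pattern: a Lambda handler that
# starts a Glue job for each object that lands in a raw bucket.
# The job name and argument keys are hypothetical.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 put-notifications deliver one or more records per invocation.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="curate-raw-data",  # hypothetical Glue job
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
    return {"status": "ok"}
```

Step Functions would typically sit on top of this for multi-stage pipelines, with CloudWatch carrying the logs and alarms the posting mentions.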

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients: At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer

What You Will Do: Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member that assists in design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to standard processes for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.

What We Expect Of You: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's or Bachelor’s degree and 5 to 9 years’ experience in Computer Science, IT or a related field.

Must Have:
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing (a tuning sketch follows this listing).
- Proficiency in data analysis tools (e.g. SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores.
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Proven ability to optimize query performance on big data platforms.
- Experience in Real World Data / Health Care.

Preferred Qualifications:
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.
- Strong understanding of data governance frameworks, tools, and best practices.
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Professional Certifications:
- Databricks certification preferred.
- AWS Data Engineer/Architect.

Soft Skills:
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.

What You Can Expect Of Us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
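Among the must-haves above is performance tuning on big data processing. One standard Spark technique is broadcasting a small dimension table to avoid a shuffle-heavy join; the sketch below illustrates it with hypothetical table shapes and join keys.

```python
# Minimal sketch of one Spark performance-tuning technique: broadcasting a
# small dimension table so the join ships it to every executor instead of
# shuffling both sides. Table shapes and the join key are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("broadcast-join").getOrCreate()

facts = spark.range(10_000_000).withColumnRenamed("id", "patient_id")
dims = spark.createDataFrame(
    [(0, "oncology"), (1, "inflammation")], ["site_id", "area"]
)

# Without the hint, Spark may plan a shuffle join; broadcast() forces the
# small side to be replicated to executors.
joined = facts.join(broadcast(dims), facts.patient_id % 2 == dims.site_id)
joined.explain()  # the plan should show BroadcastHashJoin
```

Spark auto-broadcasts tables under a size threshold, but an explicit hint like this is the usual fix when statistics mislead the optimizer.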

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - Strategy and Transactions (SaT) – DnA Senior Analyst

EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors.

The opportunity: We’re looking for a Senior Analyst - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role works closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also on-shore facing. The role will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. You will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse set-up/data mart creation, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve capabilities (OLAP) in order to create impactful decision analytics reporting.

Your Key Responsibilities:
- Evaluate and select data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration, for both on-prem and cloud-based engagements.
- Apply strong working knowledge across the technology stack, including ETL, ELT, data analysis, metadata, data quality, audit and design.
- Design, develop, and test in an ETL tool environment (GUI/canvas-driven tools to create workflows).
- Produce design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.).
- Provide technical leadership to a team of data warehouse and business intelligence developers.
- Coordinate with other technology users to design and implement data governance, data harvesting, cloud implementation strategy, privacy, and security.
- Adhere to ETL/data warehouse development best practices.
- Own data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP).
- Assist the team with performance tuning for ETL and database processes.

Skills And Attributes For Success:
- Minimum of 6 years of total experience, with 3+ years in the data warehousing/business intelligence field.
- Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse (an SCD sketch follows this listing).
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies, using industry-standard tools and technologies.
- Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP).
- Minimum 3+ years' experience in Azure database offerings (relational, NoSQL, data warehouse).
- 2+ years' hands-on experience in various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks.
- Minimum of 3 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse.
- Strong in PySpark and SparkSQL.
- Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.).
- Strong creative instincts related to data analysis and visualization, and an aggressive curiosity to learn the business methodology, data model and user personas.
- Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends.
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management.
- Willingness to mentor team members.
- Solid analytical, technical and problem-solving skills.
- Excellent written and verbal communication skills.

To qualify for the role, you must have:
- Bachelor’s or equivalent degree in computer science or a related field (required); advanced degree or equivalent business experience preferred.
- A fact-driven and analytically minded approach with excellent attention to detail.
- Hands-on experience with data engineering tasks, such as building analytical data records, and experience manipulating and analyzing large volumes of data.
- Relevant work experience of minimum 6 to 8 years in a Big 4 or technology/consulting set-up.

Ideally, you’ll also have:
- Ability to think strategically and end-to-end with a result-oriented mindset.
- Ability to build rapport within the firm and win the trust of clients.
- Willingness to travel extensively and to work on client sites/practice office locations.
- Experience in Snowflake.

What We Look For: A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries.

What Working At EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
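Data migration mappings for a modern warehouse often involve slowly changing dimensions. As a rough illustration of the PySpark/SparkSQL skills the posting lists, the sketch below detects changed or new rows for a Type-2 dimension load; all column names and data are hypothetical.

```python
# Minimal sketch of change detection for a Type-2 slowly changing dimension:
# compare staging rows against the current dimension rows and keep only the
# new or changed ones. Column names and data are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

current = spark.createDataFrame(
    [(1, "Alice", "Mumbai", True)], ["cust_id", "name", "city", "is_current"]
)
staging = spark.createDataFrame(
    [(1, "Alice", "Pune"), (2, "Bob", "Delhi")], ["cust_id", "name", "city"]
)

cond = F.col("s.cust_id") == F.col("c.cust_id")
changed_or_new = (
    staging.alias("s")
    .join(current.filter("is_current").alias("c"), cond, "left")
    # New customers have no current row; changed ones differ on tracked columns.
    .filter(F.col("c.cust_id").isNull() | (F.col("s.city") != F.col("c.city")))
    .select("s.cust_id", "s.name", "s.city")
    .withColumn("is_current", F.lit(True))
)
changed_or_new.show()
# A full SCD2 load would also expire the matching current rows
# (is_current = false, end_date = load date) before appending these.
```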

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Innovate is the brand face of INCORES Strategic Services LLP, short for Innovate Consultancy Resources. We are a multi-domain talent solutions partner offering HR, Technology, Staffing and Digital Marketing services. We collaborate with leading consulting and implementation partners to deliver top-quality talent to major enterprises and organizations across India. We're a multidisciplinary consultancy delivering creative, digital, and strategic solutions to help businesses innovate, grow, and lead.

We are actively assisting one of our esteemed clients in identifying a skilled Java MuleSoft Developer to support their enterprise integration initiatives. The ideal candidate will have robust experience in the MuleSoft Anypoint Platform and core Java development, with the ability to design and implement scalable API-driven solutions.

Job Description:
- Develop, test, and deploy scalable APIs and integrations using MuleSoft.
- Write custom logic and reusable components in Java to extend MuleSoft capabilities.
- Collaborate with cross-functional client teams to gather requirements and deliver solutions aligned with API-led connectivity principles.
- Ensure API performance, reliability, and security standards are met (an OAuth2 sketch follows this listing).
- Participate in code reviews, version control, and release management processes.
- Maintain technical documentation and support system troubleshooting for integrations in production.
- Follow agile methodologies, providing regular updates and participating in sprint ceremonies.

Must-Have Skills & Experience:
- 4 to 6 years of professional experience as a MuleSoft Developer with a strong command of Java.
- Hands-on experience with MuleSoft ESB, DataWeave, MUnit, and API integration.
- Proven ability to work with RESTful/SOAP APIs, RAML/OpenAPI specifications, and API lifecycle management.
- Experience developing and maintaining MuleSoft and NiFi APIs, NiFi ETL, and Groovy scripting; hands-on experience with API security protocols (OAuth2, JWT, HTTPS).
- Experience integrating with third-party applications, databases, cloud systems, and legacy systems.
- Familiarity with CI/CD tools such as Git, Jenkins, and Maven.

Good to Have:
- MuleSoft Certification (MuleSoft Certified Developer – Level 1).
- Experience working with cloud platforms like AWS, Azure, or Google Cloud.
- Exposure to containerization tools (Docker, Kubernetes).

Consultant Engagement Details:
- Engagement Type: Permanent
- Onboarding: Immediate or as per mutual availability
- Client Location: All India
- Mode of Work: Hybrid, based on project requirements
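The must-haves call out API security protocols such as OAuth2 and JWT. The flow below is sketched in Python for consistency with this page's other examples (the role itself is Java/MuleSoft, where the Anypoint platform handles much of this); every URL and credential is a hypothetical placeholder.

```python
# Minimal sketch of the OAuth2 client-credentials flow used to secure
# service-to-service API calls. All URLs and credentials are hypothetical.
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"   # hypothetical
API_URL = "https://api.example.com/v1/orders"        # hypothetical

def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

token = get_token("my-client", "my-secret")
resp = requests.get(
    API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30
)
resp.raise_for_status()
print(resp.json())
```

In a MuleSoft deployment the equivalent validation usually sits in an API Manager policy rather than application code, which is why the spec knowledge (RAML/OpenAPI) matters as much as the code.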

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Join our digital revolution in NatWest Digital X. In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter. Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India and as such all normal working days must be carried out in India.

Job Description: Join us as a Data Engineer.
- This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences.
- You’ll be simplifying the bank by developing innovative data-driven solutions, using insight to be commercially successful, and keeping our customers’ and the bank’s data safe and secure.
- Participating actively in the data engineering community, you’ll deliver opportunities to support the bank’s strategic direction while building your network across the bank.
- We're offering this role at associate level.

What you'll do: As a Data Engineer, you’ll play a key role in driving value for our customers by building data solutions. You’ll be carrying out data engineering tasks to build, maintain, test and optimise a scalable data architecture, as well as carrying out data extractions, transforming data to make it usable to data analysts and scientists, and loading data into data platforms (an orchestration sketch of this loop follows this listing). You’ll also be:
- Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development.
- Practicing DevOps adoption in the delivery of data engineering, proactively performing root cause analysis and resolving issues.
- Collaborating closely with core technology and architecture teams in the bank to build data knowledge and data solutions.
- Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions.
- Sourcing new data using the most appropriate tooling and integrating it into the overall solution to deliver for our customers.

The skills you'll need: To be successful in this role, you’ll need a minimum of four years of experience, a good understanding of data usage and dependencies with wider teams and the end customer, and experience of extracting value and features from large-scale data. You’ll also demonstrate:
- Experience of ETL technical design, including data quality testing, cleansing and monitoring, and data warehousing and data modelling capabilities.
- Proficiency in Python, PySpark, SQL, CI/CD pipelines, and Git version control.
- Experience in data engineering toolsets such as Airflow, relational databases (PostgreSQL/Oracle/DB2), Snowflake, S3, EMR, Databricks and data pipelines.
- Experience in reporting tools such as QuickSight.
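The extract-transform-load loop described under "What you'll do" is typically orchestrated with a tool like Airflow, which the skills list names. Here is a minimal, hypothetical DAG sketch; the task bodies and schedule are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal sketch of an Airflow DAG wiring an extract -> transform -> load
# sequence. Task logic, IDs, and the schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source systems")

def transform():
    print("clean and shape for analysts")

def load():
    print("write to the data platform")

with DAG(
    dag_id="customer_data_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # dependency chain: extract before transform before load
```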

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: ETL Test Automation - Senior Test Engineer

We are looking for a highly skilled and experienced ETL test automation professional to join as a Senior Test Engineer.

Technical Expertise:
- 3 to 5 years of ETL/DW test automation experience.
- Strong knowledge of ETL processes, data warehouse concepts and database testing.
- Experience in big data testing, focusing on both automated and manual testing for data validation.
- Proficient in writing complex SQL queries (preferably BigQuery) and understanding database concepts.
- Understanding of GCP tools: BigQuery, Dataflow, Dataplex, Cloud Storage.
- Ability to transform simple and complex business logic into SQL queries.
- Hands-on experience in Python for test automation.
- Familiarity with test automation frameworks.
- Excellent communication and client-facing skills.
- Experience with version control systems like GitLab and test management tools such as JIRA and Confluence.
- Demonstrated experience working in an Agile/SCRUM environment.
- GCP certifications or training in cloud data engineering.
- Familiarity with data governance, metadata management, and data forms.
- Exposure to real-time/streaming data systems, including monitoring, validation, and scaling strategies.

Key Responsibilities:
- Design, execute, and maintain QA strategies for ETL/Data Warehouse workflows on Google Cloud Platform (GCP).
- Validate large-scale data migrations to ensure accuracy and completeness between source and target systems (a BigQuery sketch follows this listing).
- Develop and maintain automation scripts using Python or any relevant automation tool.
- Identify, investigate and resolve data anomalies and quality issues.
- Write and optimize complex SQL queries (preferably for BigQuery) to validate transformations, mappings, and business rules.
- Work closely with data engineers, architects and analysts to understand data requirements and support data quality initiatives.
- Collaborate in an Agile/SCRUM development environment.
- Perform manual and automated data validations for high-volume pipelines.
- Track and manage defects using JIRA and maintain transparency via Confluence.
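Validating migrations between source and target systems usually starts with reconciliation queries. The sketch below shows a minimal BigQuery row-count check driven from Python with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# Minimal sketch of a source-vs-target validation in BigQuery: a single query
# counts both sides and the test asserts they agree. Project/dataset/table
# names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

CHECK_SQL = """
SELECT
  (SELECT COUNT(*) FROM `proj.staging.orders`) AS source_rows,
  (SELECT COUNT(*) FROM `proj.curated.orders`) AS target_rows
"""

row = next(iter(client.query(CHECK_SQL).result()))
assert row.source_rows == row.target_rows, (
    f"Row count mismatch: {row.source_rows} vs {row.target_rows}"
)
print("row counts match:", row.source_rows)
```

Wrapped in pytest, checks like this extend naturally to column-level aggregates and business-rule assertions over the transformed data.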

Posted 1 week ago

Apply

2.0 years

0 - 0 Lacs

DLF Ph-II, Gurugram, Haryana

On-site

Hiring: Power BI Engineer/Developer Opportunity!
Location: Gurgaon

What you'll do:
- Design and develop impactful BI reports and dashboards.
- Manage and maintain data integrity.
- Collaborate with stakeholders to understand their data needs.
- Ensure the accuracy and reliability of all data.
- Troubleshoot and maintain existing BI solutions.
- Implement and enforce data security measures.

What we're looking for:
- Minimum 2 years of experience in Power BI development.
- B.Tech/B.E. in Computer Science or Information Technology.
- Expert-level proficiency in Power BI.
- Strong knowledge of MS SQL Server, DAX, and other data languages.
- Experience with data ETL/import/export tools and BI integration tools (a load sketch follows this listing).
- Familiarity with ASP.NET applications is a plus.

Job Type: Full-time
Pay: ₹40,000.00 - ₹55,000.00 per month
Benefits: Provident Fund
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: DLF Ph-II, Gurugram, Haryana: reliably commute or plan to relocate before starting work (Required)
Application Question(s): Can you appear for a face-to-face interview round?
Education: Bachelor's (Required)
Experience: Power BI: 2 years (Required)
Location: DLF Ph-II, Gurugram, Haryana (Required)
Work Location: In person
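Report quality in Power BI depends on clean loads into SQL Server upstream. As a rough sketch of the ETL/import work listed above, here is a pandas + SQLAlchemy load; the connection string, file, columns, and table name are hypothetical placeholders.

```python
# Minimal sketch of a CSV -> MS SQL Server load feeding a BI staging table.
# Connection string, file path, columns, and table name are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:pass@dbserver/reporting"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

df = pd.read_csv("sales_export.csv", parse_dates=["order_date"])
df = df.dropna(subset=["order_id"])  # basic integrity gate before loading
df.to_sql("stg_sales", engine, if_exists="replace", index=False)
print(f"loaded {len(df)} rows into stg_sales")
```

Power BI would then connect to the curated tables built from this staging layer, with DAX measures defined over them.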

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description

Job Description: As a PM I in the AMXL Supply Chain team, you will have the opportunity to solve business and customer-centric problems to improve the efficiency, cost and speed of fulfilment for the Heavy & Bulky channel. In this role you will partner with stakeholders across business and operations verticals to solve complex technical problems, preferably with simple but scalable solutions. Supply Chain managers take customer anecdotes seriously, are data-driven, and create and leverage mechanisms to build scalable solutions that eliminate ambiguity. They are comfortable working in teams with advanced analytical, mathematical, and quantitative requirements.

Basic Qualifications:
- At least 4 years of experience in a top-tier company driving programs or projects. A Supply Chain background is preferable but not mandatory; we hire for core skills, not specializations.
- Bachelor’s degree in Computer Science, Physics, Mathematics, Statistics, Engineering, or similar.
- Ability to grasp the operational concepts of customer order flow across all miles, with different volume and demand patterns.
- Able to manage a business that operates 24/7 and commit the time required to get the job done.
- Business analysis and partnership across Amazon with AFT, SME and Operations leaders to develop new operating opportunities.
- Regularly monitor performance markers and drive continuous improvement to optimize process consistency, cost and customer experience.
- Ability to deal with ambiguity, take high-confidence assumptions, or seek help and elevate discussions wherever required to deliver results.
- Ability to back up narratives and decisions with data.
- High degree of ownership; self-motivated, with the backbone to stand up for what is right vs. what is easy or comfortable.
- Able to function independently with limited guidance, deal with varied stakeholders and earn their trust, and influence stakeholders up to +1 level without authority by using data and facts.
- Working knowledge of data mining using SQL, ETL and data warehouses, as well as Excel.

Preferred Qualifications:
- Strong presentation skills; ability to independently present narratives and take questions from stakeholders in open forums.
- Proficiency in VBA macros, QuickSight or related dashboard tools, Python, R, etc.
- Excellent written communication skills: data-oriented, with brevity as a key trait of your writing style.
- Direct work experience in e-commerce, warehousing or delivery station operations.

Basic Qualifications: Bachelor's degree or equivalent.
Preferred Qualifications: 2+ years of program or project management experience.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A3031605

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. Overview of the role The Business research Analyst will be responsible for data and Machine learning part of continuous improvement projects across compatibility and basket building space. This will require collaboration with local and global teams, which have process and technical expertise. Therefore, RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In compatibility program, RA perform Big data analysis to identify patterns, train model to generate product to product relationship and product to brand & model relationship. RA also continuously improve the ML solution for higher solution accuracy, efficiency and scalability. RA should writes clear and detailed functional specifications based on business requirements as well as writes and reviews business cases. Key job responsibilities Scoping, driving and delivering complex projects across multiple teams. Performs root cause analysis by understand the data need, get data / pull the data and analyze it to form the hypothesis and validate it using data. Conducting a thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications. Developing and implementing machine learning models and deep learning architectures to improve NLP systems. Designing and implementing core NLP tasks such as named entity recognition, classification and part-of-speech tagging. Dive deep to drive product pilots, build and analyze large data sets, and construct problem hypotheses that help steer the product feature roadmap (e.g. with use of Python), tools for database (e.g. SQL, spark) and ML platform (tensorflow, pytorch) Conducting regular code reviews and implementing quality assurance processes to maintain high standards of code quality and performance optimization. Providing technical guidance and mentorship to junior team members and collaborating with external partners to integrate cutting-edge technologies. Find the scalable solution for business problem by executing pilots and build Deterministic and ML model (plug and play on readymade ML models and python skills). Performs supporting research, conduct analysis of the bigger part of the projects and effectively interpret reports to identify opportunities, optimize processes, and implement changes within their part of project. Coordinates design effort between internal team and external team to develop optimal solutions for their part of project for Amazon’s network. Ability to convince and interact with stakeholders at all level either to gather data and information or to execute and implement according to the plan. 
About The Team
Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Services (RBS) organization is a core part of leading customer experience and selling-partner experience optimization. This team is part of the RBS Customer Experience business unit. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The compatibility program handled by this team has a direct impact on customer buying decisions and online user experience; it aims to answer customer purchase questions about whether two products work together, as well as to reduce returns due to incompatibility.
Basic Qualifications
Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills
Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models
Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking
Strong problem-solving skills, creativity, and ability to overcome challenges
SQL/ETL, automation tools
Relevant bachelor’s degree or higher
3+ years of combined relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g., a mix of experience across different roles
Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones
Preferred Qualifications
3+ years of combined relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g., a mix of experience across different roles
Understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services
Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch
Technical expertise and experience in data science and ML
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Company - ADCI - BLR 14 SEZ
Job ID: A3031496

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities
‒ Create GraphQL (GQL) APIs by developing in Go.
‒ Assemble large, complex data sets that meet functional and non-functional business requirements.
‒ Build the Go plugins required for ETL processes, for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
‒ Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
‒ Keep our data separated and secure across national boundaries through multiple data centers and regions.
‒ Work with data and analytics experts to strive for greater functionality in our data systems.
‒ Manage exploratory data analysis to support database and dashboard development.
Key Requirements
Experience: 8-10 years would be preferable.
Required Skills
‒ A deep understanding of both fundamental programming concepts and the unique features of Golang.
‒ Well-versed in the Go language, including pointer management, with an understanding of how to use its mature standard library.
‒ Proficiency in the Go programming language, understanding of concurrent programming, and familiarity with tools like Docker and Kubernetes.
‒ Strong problem-solving abilities and effective communication skills.
‒ Good knowledge of REST APIs, databases, and SQL.
‒ Understanding of index design and performance-tuning techniques.
‒ Exposure to source control tools like Git and Azure DevOps.
‒ Understanding of Agile methodologies (Scrum, Kanban).
‒ Experience with automated testing and coverage tools, and experience with CI/CD automation tools (desirable).
Personal Attributes
‒ Very good communication skills.
‒ Ability to easily fit into a distributed development team.
‒ Ability to manage timelines of multiple initiatives.
‒ Ability to articulate insights from the data and help business teams make decisions.

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

Who We are Looking For:
We are hiring a hands-on Gen-AI Developer who has strong coding experience in C#, a solid understanding of Gen AI (Generative AI) concepts like LLMs (Large Language Models) and AI agents, and is well-versed in using Azure cloud services. This role is not for managers or theorists — we need someone who can build, test, and deploy real-world AI applications and solutions.
Must-Have Skills & Expertise
Programming & Frameworks
C# Programming Language – Strong command and ability to build enterprise-level applications.
.NET and .NET Aspire – Experience building scalable AI applications using the .NET ecosystem.
AI & Gen-AI Development
Experience working with LLMs (Large Language Models) such as OpenAI GPT, Azure OpenAI, or local models.
Hands-on experience with Generative AI tools and frameworks.
Semantic Kernel (Microsoft) – Ability to use and integrate Semantic Kernel for building AI agents and orchestrating tasks.
AI Agent Concepts
Understanding of how AI agents work (multi-step reasoning, task decomposition, autonomous behavior).
Ability to design, build, and optimize Agentic AI systems.
Cloud Platform: Microsoft Azure
Should have deployed or worked on AI solutions using the following Azure services:
App Services
Containers (Docker on Azure)
AI Search
Bot Services
AI Foundry
Cloud-native development and serverless architectures are a strong plus.
Data Science & Machine Learning
End-to-end ML pipeline development: from data ingestion, model training, and fine-tuning to deployment.
Comfortable working with ML frameworks like MLflow, Kubeflow, or TFX.
Experience with model fine-tuning and deployment, especially LLMs.
Data & Pipelines
Knowledge of building data pipelines using:
Apache Airflow
Apache Kafka
Azure Data Factory
Experience with both structured (SQL) and unstructured (NoSQL) data.
Familiarity with Data Lakes, Data Warehouses, and ETL workflows.
Infrastructure & DevOps
Experience with:
Containerization using Docker and Kubernetes.
Infrastructure as Code tools like Terraform or Azure Resource Manager (ARM).
CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins.
Building and automating end-to-end pipelines for AI/ML models.
Cloud Security & Cost Management
Solid understanding of:
Cloud security best practices – IAM, VPCs, firewalls, encryption, etc.
Cloud cost optimization – autoscaling, efficient resource allocation, and budget tracking.
Key Responsibilities
Develop, test, and deploy intelligent applications using C# and Gen-AI technologies.
Build and optimize AI agents using Semantic Kernel and LLMs.
Create full ML/AI solutions — from data processing, model training, and evaluation to production deployment.
Integrate and manage Azure AI services in enterprise solutions.
Design and maintain data pipelines, model orchestration workflows, and automated deployments.
Work collaboratively with cross-functional teams (data scientists, DevOps engineers, backend developers).
Ensure performance optimization of deployed models and infrastructure.
Maintain cloud cost efficiency and monitor infrastructure using the right tools and strategies.
Follow Agile methodologies (Scrum/Kanban), and participate in sprint planning, code reviews, and team stand-ups.
Maintain code quality, documentation, and test coverage.
Soft Skills Required
Clear communication skills – You should be able to explain technical ideas to both tech and non-tech stakeholders.
Collaborative mindset – You’ll work closely with DevOps, ML Engineers, and Data Scientists.
Strong analytical and problem-solving skills – Able to break down complex problems into actionable steps.
Self-motivated and hands-on – You enjoy coding, experimenting, and deploying real systems.
Adaptable to new tools and the fast-changing Gen-AI landscape.
Ideal Candidate Summary
Someone who can code in C#, work with Azure services, and understands AI at a hands-on level. You’ve built or deployed Gen-AI models, worked with LLMs and AI Agents, and can set up the whole stack — from data to deployment, securely and efficiently. You are not afraid to get your hands dirty with containers, pipelines, code, or model tuning.
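The role itself is C#-centric, but the agent behaviors it names (multi-step reasoning, task decomposition) are language-neutral. Below is a minimal illustrative sketch in Python, where call_llm is a hypothetical stand-in for any chat-completion client (e.g., Azure OpenAI), not a real API:

    def call_llm(prompt: str) -> str:
        # Hypothetical stand-in for a chat-completion call (e.g., Azure OpenAI).
        raise NotImplementedError("wire this to your model provider")

    def run_agent(goal: str, max_steps: int = 5) -> list[str]:
        # Task decomposition: ask the model to break the goal into steps.
        plan = call_llm(f"Break this goal into short, numbered steps: {goal}")
        steps = [line.strip() for line in plan.splitlines() if line.strip()]
        results: list[str] = []
        for step in steps[:max_steps]:
            # Multi-step reasoning: each step sees the results accumulated so far.
            results.append(call_llm(f"Context so far: {results}\nDo this step: {step}"))
        return results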

Posted 1 week ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

A) About the Role
We are seeking a highly skilled and experienced ‘Senior Analyst – Enterprise SaaS’ to join our team, specializing in the Indian power sector. The ideal candidate will have a robust background in optimization using GAMS (General Algebraic Modelling System), machine learning algorithm development, financial modelling, and energy portfolio management. Additionally, expertise in backend development using Python and R, advanced visualization techniques with Python, Tableau, and Power BI, along with database management, is required.
B) Detailed expectations from the role
The key responsibilities of this role will include the following:
Optimization & Model Development:
Developing optimization models for the power portfolios of State Utilities, based on Resource Adequacy guidelines, using GAMS.
Develop, implement, and optimize machine learning models (LSTM, XGBoost, ARIMA, SARIMA, LR, Ridge, Lasso, RF, etc.) for demand and price forecasting, anomaly detection, etc.
Utilize Python, R, TensorFlow, scikit-learn, and other libraries to build robust models.
Collaborate with cross-functional teams to integrate machine learning models into the production environment.
Utilize EDA (exploratory data analysis) and ETL (extract, transform, load) tools for developing machine learning models.
Manage and optimize large-scale databases using SQL and NoSQL (MongoDB).
Build and maintain financial models using Python, R, and Advanced Excel to assess and manage energy portfolios, risk assessment, and investment strategies.
Analyse market trends, regulatory impacts, and economic factors influencing the Indian power sector using statistical techniques.
System (data) automation using VBA and macros.
Hands-on experience in short/medium/long-term load forecasting and price forecasting using statistical and machine learning tools.
Advanced Visualization Techniques:
Develop insightful and interactive visualizations using Python (Matplotlib, Seaborn), Advanced Excel, Tableau, and Power BI to support decision-making and stakeholder communication.
Create and maintain dashboards with meaningful reports that monitor key performance indicators and model outputs.
Optimization and energy management dashboard creation.
C) Required skill set
Developing models for power portfolio optimization using GAMS.
Expertise in time series analysis and forecasting techniques using machine learning.
Managing and optimizing databases like SQL and NoSQL (MongoDB).
Utilizing Python libraries like Pandas, SciPy, TensorFlow, scikit-learn, and others to build forecasting models.
Utilizing Python libraries like Matplotlib, Seaborn, and other tools to develop visualization insights from data.
Proficiency in Advanced Excel, Power BI, VS Code, and various other tools for data analysis.
Preferred Skills:
Understanding of electricity energy trading.
Familiarity with optimization techniques for energy management.
Experience with Git.
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Knowledge of Indian energy sector policies, regulations, and market operations.
Strong communication and collaboration skills.
Client management.
D) Education and Experience
B.Tech/Master’s in Electrical or Energy Engineering, Computer Science, Information Technology, or related fields like Statistics, Mathematics, Economics, etc., with 5+ years’ experience in power sector optimization and forecasting models.
Relevant experience in backend development with specialization in optimization, data science, machine learning, and database management, with a focus on the energy sector.
Proficiency in GAMS, Python, R, and advanced visualization tools (Power BI and Advanced Excel).
Understanding of energy markets and portfolio management.
E) Work Location
The base location shall be New Delhi. However, the role might require the applicant to undertake travel for pursuing various opportunities.
F) Remuneration Structure
We offer a motivation-based and competitive reward package.
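As a hedged illustration of the short-term load forecasting described above, a minimal SARIMA sketch with statsmodels might look like the following (the CSV file, column names, and model orders are assumptions, not a prescribed setup):

    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Assumed input: an hourly load series indexed by timestamp (placeholder names).
    load = pd.read_csv("hourly_load.csv", index_col="timestamp", parse_dates=True)["load_mw"]

    # Seasonal order (1, 1, 1, 24) encodes a daily cycle in hourly data.
    model = SARIMAX(load, order=(1, 1, 1), seasonal_order=(1, 1, 1, 24))
    fitted = model.fit(disp=False)

    # Forecast the next 24 hours of demand.
    print(fitted.forecast(steps=24))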

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Company Description Agilisium Consulting specializes in reimagining and co-developing AI-engineered business processes that are autonomous, scalable, and tailored for the Life Sciences industry. Recognized by leading independent analyst firms such as the Everest Group and ISG, and endorsed by consulting leaders like EY and technology giants such as AWS, Agilisium is known for its rigorous execution and continuous innovation. This commitment allows us to shape the future of the life sciences sector effectively. Role Description This is a full-time, on-site role for a Data Architect located in Chennai. The Data Architect will be responsible for designing and developing data architecture solutions, leading data modeling efforts, and ensuring effective data governance. Daily tasks include managing data warehousing, overseeing Extract Transform Load (ETL) processes, and collaborating with various teams to optimize data management practices. Qualifications Strong skills in Data Architecture and Data Modeling Experience with Data Governance Proficiency in Extract Transform Load (ETL) processes Expertise in Data Warehousing Excellent problem-solving and analytical skills Strong communication and teamwork abilities Bachelor's or Master's degree in Computer Science, Information Technology, or a related field Experience in the Life Sciences industry is a plus

Posted 1 week ago

Apply

6.0 - 7.0 years

0 Lacs

Chandigarh

On-site

Job Summary
If you are looking for an opportunity in Technology Solutions and Development, Emerson has this exciting role for you! The candidate will be responsible for supporting the BI tool stack, especially SSIS, SSRS, Azure Synapse Analytics, ETL, SQL Server programming, and Data Marts, and for providing prompt support, maintenance, and development on assigned projects and other responsibilities. This team delivers technology solutions for strategic business needs, drives adoption of these services and support processes, and boosts value by enhancing our customers’ experience. In this role you will work alongside a hardworking and dedicated team of self-motivated professionals who share a collective passion for progress and excellence.
In this Role, Your Responsibilities Will Be:
Understand the business need, develop solutions, and support production systems and monitoring.
Be proficient in the Microsoft BI tool stack and SQL Server programming.
Perform root cause analysis on production issues, using data to diagnose bottlenecks and inefficiencies.
Refine and automate regular processes, track issues, and document changes.
Gather and analyze requirements for changes in ETL, support, and performance improvements.
Configure and maintain database servers and processes, including monitoring daily scheduled jobs and monitoring system health and performance, to ensure high levels of performance, availability, and security.
Handle tier-3 support tickets and provide resolution within defined service level agreements.
Write ETL jobs to move different types of data from Oracle to SQL Server Data Marts to support new or existing reports.
Implement SSIS package configurations (environment variable and SQL Server).
Design, develop, and deploy reports based on requirements using SQL Server Reporting Services (SSRS), including extensive work on cross-tab and matrix reports.
Experience working with Azure services like Azure Data Factory and Azure Data Lake (good to have).
Must be skilled in writing complex SQL and PL/SQL programs and proficient in performance tuning.
On-time coordination and status reporting to the client.
Good communication skills, with the ability to convey technical concepts to both technical and non-technical customers.
Ability to work independently or within a team environment.
Microsoft certifications in data-related fields are preferred.
Work closely with partners from Business, IT, users, admins, functional managers, and other counterparts.
Who You Are:
You show a tremendous amount of initiative in tough situations and are exceptional at spotting and seizing opportunities. You observe situational and group dynamics and select the best-fit approach. You make implementation plans that allocate resources precisely. You pursue everything with energy, drive, and the need to finish.
For This Role, You Will Need:
The incumbent in this position would be responsible for working on various BUs’ MSBI data warehouse support and projects, providing optimal solutions using the Microsoft BI tool stack, especially SSIS, SSAS, SSRS, Azure Synapse Analytics, ETL, SQL Server programming, and Data Marts, and providing prompt support, maintenance, and development on assigned projects and other responsibilities.
6-7 years of experience working with the Microsoft BI tool stack and SQL Server programming.
6-7 years of experience with SSDT tools, especially SSIS, SSAS, SSRS, and SQL Server.
Strong analytical abilities with a proven track record of resolving complex data challenges.
Proficient in database management, SQL query optimization, and data mapping.
Understanding of Excel, including formulas, filters, macros, pivots, and related operations.
Demonstrable experience leading and managing data-driven projects or teams.
Strong proficiency with data visualization tools such as Power BI or similar tools.
Experience in strategic business analysis and collaborating with senior leadership.
Good problem-solving and analytical skills.
Flexibility to work in a 24x7 environment.
Preferred Qualifications that Set You Apart:
Bachelor’s degree or equivalent in Science with a technical background (MIS, Computer Science, Engineering, or any related field).
Good interpersonal skills in English, both spoken and written, as you will be working with overseas teams.
Our Culture & Commitment to You
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives — because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together, are key to driving growth and delivering business results.
We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave.
WHY EMERSON
Our Commitment to Our People
At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration.
We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world’s most complex problems — for our customers, our communities, and the planet. You’ll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor.
At Emerson, you’ll see firsthand that our people are at the center of everything we do. So, let’s go. Let’s think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let’s go, together.
Accessibility Assistance or Accommodation
If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com.
ABOUT EMERSON
Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical, and advanced factory automation operate more sustainably while improving productivity, energy security, and reliability.
With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
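The ETL work this posting describes centers on SSIS; purely as a language-neutral illustration of the Oracle-to-SQL-Server extract-and-load step it mentions, a hedged Python sketch (connection strings and table names are placeholders) could look like:

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder connection strings -- real values depend on the environment.
    oracle = create_engine("oracle+oracledb://user:pwd@ora-host:1521/?service_name=ORCL")
    mssql = create_engine("mssql+pyodbc://user:pwd@sql-host/DataMart?driver=ODBC+Driver+17+for+SQL+Server")

    # Extract from Oracle in chunks to bound memory, then append into a staging table.
    for chunk in pd.read_sql("SELECT * FROM sales_orders", oracle, chunksize=50_000):
        chunk.to_sql("stg_sales_orders", mssql, if_exists="append", index=False)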

Posted 1 week ago

Apply

7.0 - 9.0 years

5 - 8 Lacs

Gurgaon

On-site

The role aims to leverage data analysis, engineering, and AI/ML techniques to drive strategic business decisions and innovations. This position is responsible for designing and implementing scalable data pipelines, developing innovative models, and managing cloud infrastructure to ensure efficient data processing and storage. The role also involves collaborating with cross-functional teams to translate business needs into technical solutions, mentoring junior team members, and staying abreast of the latest technological advancements. Effective communication, particularly in English, is essential to articulate complex insights and foster a collaborative environment. The ultimate goal is to enhance data-driven decision-making and maintain a competitive edge through continuous improvement and innovation.
Data and AI Specialist, Consulting role
Key Responsibilities:
Python development: Python developer experienced with Azure Cloud, using Azure Databricks for data science; create models and algorithms to analyze data and solve business problems
Application Architecture: Knowledge of enterprise application integration and application design
Cloud Management: Knowledge of hosting and supporting applications on Azure Cloud
Data Engineering: Build and maintain systems to process and store data efficiently
Collaboration: Work with different teams to understand their needs and provide data solutions. Share insights through reports and presentations
Research: Keep up with the latest tech trends and improve existing models and systems
Mentorship: Guide and support junior team members
Must have:
Python development in AI/ML and Data Analysis:
Strong programming skills in Python or R, and SQL
Proficiency in statistical analysis and machine learning techniques
Hands-on experience in NLP and NLU
Experience with data visualization and reporting tools (e.g., Power BI)
Experience with Microsoft Power Platform and SharePoint (e.g., Power Automate)
Hands-on experience using SharePoint for content management
Data Engineering:
Expertise in designing and maintaining data pipelines and ETL processes
Experience with data storage solutions (e.g.,
Azure SQL)
Understanding of data quality and governance principles
Experience with Databricks for big data processing and analytics
Cloud Management:
Proficiency in cloud platforms (e.g., Azure)
Knowledge of hosting and supporting applications on Azure Cloud
Knowledge of cloud security and compliance best practices
Collaboration and Communication:
Experience in agile methodologies and project management tools (e.g., Jira)
Strong interpersonal and communication skills
Ability to translate complex technical concepts into business terms
Experience working in cross-functional teams
Excellent English communication skills, both written and verbal
Research and Development:
Ability to stay updated with the latest advancements in data science, AI/ML, and cloud technologies
Experience in conducting research and improving model performance
Mentorship:
Experience in guiding and mentoring junior team members
Ability to foster a collaborative and innovative team environment
Must exhibit the following core behaviors: taking ownership of and accountability for the projects assigned
Qualifications
Bachelor’s or Master’s degree (or MCA) in Computer Science, Data Science, AI/ML, IT, or related fields
7-9 years of relevant experience
Proficiency in Python, R, cloud platforms (Azure), and data visualization tools like Power BI
Advanced certifications and experience with big data technologies and real-time data processing
Excellent English communication skills
Job Location

Posted 1 week ago

Apply

5.0 years

0 Lacs

Haryana

On-site

Senior Data Engineer (C11) Analytics & Information Management (AIM), Gurugram
Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.
We are seeking a highly experienced and strategic Officer – Sr. Data Engineer for the Data/Information Management team. The ideal candidate will be responsible for the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the COO (Chief Operating Office). This role requires a proven track record of implementing optimized data processes/platforms, delivering impactful insights, and fostering a data-driven culture.
The Data/Information Analyst accomplishes results by contributing significantly to the bank's success, leveraging data engineering and solution-design skills within a specialized domain. Integrates subject matter and industry expertise within a defined area. Contributes to standards around which others will operate. Requires an in-depth understanding of how areas collectively integrate within the sub-function, as well as coordination with and contribution to the objectives of the entire function. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for volume, quality, timeliness, and delivery of end results of an area.
Responsibilities:
Incumbents are primarily responsible for supporting Business Execution activities of the Chief Operating Office, implementing data engineering solutions to manage banking operations. Establish monitoring routines, scorecards, and escalation workflows.
Oversee the Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques.
Responsible for documenting data requirements and data collection/processing/cleaning, which may include process automation/optimization and data visualization techniques.
Enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies. Implement a governance framework with clear stewardship roles and data quality controls.
Interface between business and technology partners for digitizing data collection, including performance generation and validation rules for banking operations.
Build the Data Strategy by identifying all relevant product processors and creating the Data Lake, Data Pipeline, Governance & Reporting.
Communicate findings and recommendations to senior management. Stay current with the latest trends and technologies in analytics. Ensure compliance with data governance policies and regulatory requirements.
Set up a governance operating framework to enable operationalization of data domains, and identify CDEs and Data Quality rules. Align with Citi Data Governance Policies and firmwide Chief Data Office expectations.
Incumbents work with large and complex data sets (both internal and external data) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipes. Identifies and compiles data sets using a variety of tools (e.g.,
SQL, Access) to help predict, improve, and measure the success of key business outcomes. Implement rule-based Data Quality checks across critical data points; automate alerts for breaks and publish periodic quality reports. Incumbents in this role may often be referred to as a Data Analyst.
Develop and execute the analytics strategy – data ingestion, reporting/insights centralization. Ensure consistency, lineage tracking, and audit readiness across legal reporting.
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervising the activity of others and creating accountability with those who fail to maintain these standards.
Work as a senior member in a team of data engineering professionals, working with them to deliver on organizational priorities.
Qualifications:
5+ years of experience in Business Transformation Solution Design roles, with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, SQL, etc.
Strong understanding of Data Transformation – Data Strategy, Data Architecture, Data Tracing & Lineage (ability to trace data lineage from source systems to the data warehouse to reports and dashboards), Scalable Data Flow Design and Standardization, Platform Integration, ETL & Smart Automation.
Conceptual, logical, and physical data modeling expertise. Proficiency in relational and dimensional data modeling techniques. Ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools.
Database Management & Optimization: expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding.
Strong understanding of data platforms/ecosystems, with the ability to establish a scalable data management framework – data provisioning, process optimization, actionable insights, and visualization techniques using Tableau.
Solution Architect with a proven ability to translate complex data flows into automated and optimized solutions. Ability to leverage data analytics tools and techniques for analytics problem solving for organizational needs.
Experience in developing and deploying AI solutions in partnership with Tech and Business.
Experience with any banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI).
Knowledge of regulatory requirements related to data privacy and security.
Experience in interacting with senior stakeholders across the organization to manage end-to-end conceptualization and implementation of data strategies – standardizing data structures, and identifying and removing redundancies to optimize data feeds.
AI/Gen AI proficiency and thought leadership in financial/business analysis and/or credit/risk analysis, with the ability to impact key business drivers via a disciplined analytic process.
Demonstrated analytics thought leadership and project-planning capabilities.
In-depth understanding of the various financial service business models; expert knowledge of advanced statistical techniques and how to apply them to drive substantial business results.
Creative problem-solving skills.
Education: Bachelor’s/University degree in STEM; Master’s degree preferred.
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Job Level: C11
Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: MicroStrategy, Python (Programming Language), Structured Query Language (SQL), Tableau (Software). For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.
If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
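A minimal sketch of the rule-based data quality checks and break alerts described in this role, using pandas over a hypothetical extract (file, columns, and rules are all assumptions):

    import pandas as pd

    df = pd.read_parquet("payments_extract.parquet")  # hypothetical extract

    # Rule-based checks over critical data elements (CDEs).
    rules = {
        "payment_id not null": df["payment_id"].notna().all(),
        "amount non-negative": (df["amount"] >= 0).all(),
        "currency in approved set": df["currency"].isin({"USD", "EUR", "INR"}).all(),
    }

    breaks = [name for name, passed in rules.items() if not passed]
    if breaks:
        # In production this would feed an alerting/escalation workflow rather than print.
        print("Data quality breaks:", breaks)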

Posted 1 week ago

Apply

8.0 - 10.0 years

2 - 7 Lacs

Gurgaon

On-site

Who we are?
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we’ve been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world’s largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play.
This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet.
We’re helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We have committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place.
Please visit and follow Johnson Controls LinkedIn for recent exciting activities.
Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive
What will you do?
Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices
Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes
Data Governance & Security: Ensuring compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives
Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance
Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying
Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows
Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures
Training and Support: Provide training and support to teams on Snowflake usage and best practices
Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity
Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies
Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases
Requirements & Qualifications:
Minimum: Bachelor’s/Postgraduate/Master’s degree in any stream
Minimum 8-10 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
Understanding of cloud platforms (AWS,
Azure, GCP) and their integration with Snowflake
Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool
Excellent communication and collaboration skills
Strong problem-solving abilities and analytical mindset
Ability to work in a fast-paced, dynamic environment
What we offer:
We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.
Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers’ vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee’s responsibility to contribute to our culture. It’s through these contributions that we’ll drive the mindsets and behaviors we need to power our customers’ missions. You have the power. You have the voice. You have the culture in your hands.

Posted 1 week ago

Apply

5.0 years

6 - 7 Lacs

Gurgaon

On-site

Who we are?
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we’ve been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world’s largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play.
This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet.
We’re helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We have committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place.
Please visit and follow Johnson Controls LinkedIn for recent exciting activities.
Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive
How will you do it?
Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices
Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes
Data Governance & Security: Ensuring compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives
Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance
Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying
Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows
Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures
Training and Support: Provide training and support to teams on Snowflake usage and best practices
Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity
Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies
Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases
What we look for?
Minimum: Bachelor’s/Postgraduate/Master’s degree in any stream
Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool
Excellent communication and collaboration skills
Strong problem-solving abilities and analytical mindset
Ability to work in a fast-paced, dynamic environment
What we offer:
We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.
Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers’ vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee’s responsibility to contribute to our culture. It’s through these contributions that we’ll drive the mindsets and behaviors we need to power our customers’ missions. You have the power. You have the voice. You have the culture in your hands.

Posted 1 week ago

Apply

8.0 - 10.0 years

5 - 7 Lacs

Gurgaon

On-site

P2-C2-STS JD
Strong SQL skills to perform database queries, data validations, and data integrity checks. Familiarity with relational databases and data management concepts. Working experience with cloud-based data warehouse platforms like Snowflake and AWS. Experience in creating and implementing an ETL testing strategy. Experience in data integrity, data accuracy, and completeness testing. Proficient in source-to-target mapping validation test cases. Proficient in test planning, test design, test execution, and test management, preferably in the healthcare payor domain.
Lead ETL testing and data migration projects from a QA perspective, ensuring accuracy and completeness. Validate data pipelines to maintain data integrity. Perform BI report validation in Power BI for an enterprise-level Sales and Assets dashboard with a number of important KPIs, ensuring insights are accurate and actionable. Execute an automation framework for data validation and reconciliation. Interact with business stakeholders and provide UAT support during the UAT cycle. Write complex SQL queries on Snowflake to maintain data quality. Maintain test cases in JIRA and Zephyr. Attend all scrum ceremonies, such as sprint review meetings and daily stand-ups.
Mandatory Skills
8 to 10 years of ETL testing experience
Snowflake and AWS
Business intelligence and data warehouse testing
SQL queries and testing data flow across the data layers
Testing data quality, data integrity, and data reconciliation
Understanding of data warehouses
Working with Agile teams
ETL testing strategy
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
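As a hedged sketch of the source-to-target reconciliation this role describes, one might compare row counts and a simple checksum between layers on Snowflake (credentials, schemas, and table names below are placeholders):

    import snowflake.connector

    conn = snowflake.connector.connect(account="...", user="...", password="...")  # placeholders
    cur = conn.cursor()
    # Compare row counts and an amount checksum between staging and the curated layer.
    cur.execute("""
        SELECT
            (SELECT COUNT(*)    FROM staging.claims) AS src_rows,
            (SELECT COUNT(*)    FROM curated.claims) AS tgt_rows,
            (SELECT SUM(amount) FROM staging.claims) AS src_sum,
            (SELECT SUM(amount) FROM curated.claims) AS tgt_sum
    """)
    src_rows, tgt_rows, src_sum, tgt_sum = cur.fetchone()
    assert src_rows == tgt_rows and src_sum == tgt_sum, "Source-to-target break detected"
    cur.close()
    conn.close()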

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Good knowledge of GCP, BigQuery, SQL Server, and Postgres DB. Knowledge of Datastream, Cloud Dataflow, Terraform, ETL tools, writing procedures and functions, writing dynamic code, performance tuning and complex queries, and UNIX.
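As a hedged sketch of the BigQuery side of this stack, a parameterized query via the google-cloud-bigquery client might look as follows (project, dataset, and table names are placeholders, and Application Default Credentials are assumed):

    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    # Parameterized aggregation over a placeholder table.
    job = client.query(
        "SELECT status, COUNT(*) AS n "
        "FROM `my_project.orders.events` "
        "WHERE event_date >= @since GROUP BY status",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("since", "DATE", "2024-01-01")]
        ),
    )
    for row in job.result():
        print(row.status, row.n)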

Posted 1 week ago

Apply