3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description

About Amazon.com: Amazon.com strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from website to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.

Overview of the role

The Business Research Analyst will be responsible for the data and machine learning components of continuous improvement projects across the compatibility and basket-building space. This requires collaboration with local and global teams that have process and technical expertise, so the Research Analyst (RA) should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big data analysis to identify patterns and trains models to generate product-to-product and product-to-brand/model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency, and scalability; writes clear and detailed functional specifications based on business requirements; and writes and reviews business cases.

Key job responsibilities
Scope, drive, and deliver complex projects across multiple teams.
Perform root cause analysis: understand the data need, pull the data, analyze it to form a hypothesis, and validate the hypothesis against the data.
Conduct thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications.
Develop and implement machine learning models and deep learning architectures to improve NLP systems.
Design and implement core NLP tasks such as named entity recognition, classification, and part-of-speech tagging (illustrated in the sketch below).
Dive deep to drive product pilots, build and analyze large data sets, and construct problem hypotheses that help steer the product feature roadmap, using Python, database tools (e.g., SQL, Spark), and ML platforms (TensorFlow, PyTorch).
Conduct regular code reviews and implement quality assurance processes to maintain high standards of code quality and performance optimization.
Provide technical guidance and mentorship to junior team members, and collaborate with external partners to integrate cutting-edge technologies.
Find scalable solutions to business problems by executing pilots and building deterministic and ML models (plugging into ready-made ML models with Python skills).
Perform supporting research, conduct analysis of the larger parts of projects, and effectively interpret reports to identify opportunities, optimize processes, and implement changes within their part of the project.
Coordinate design effort between internal and external teams to develop optimal solutions for their part of Amazon's network.
Convince and interact with stakeholders at all levels, whether to gather data and information or to execute and implement according to plan.
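For illustration, here is a minimal sketch of the core NLP tasks named in the responsibilities (named entity recognition and part-of-speech tagging). It assumes the spaCy library and its small English model, neither of which the posting names; the example sentence and product names are invented.

```python
# Minimal NER and POS-tagging sketch using spaCy (an assumed library choice).
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

doc = nlp("The Acme X200 charger shipped to Bengaluru is compatible with the Acme Phone.")

# Named entity recognition: each span carries a predicted label (ORG, GPE, PRODUCT, ...).
for ent in doc.ents:
    print(ent.text, ent.label_)

# Part-of-speech tags from the same parse.
for token in doc:
    print(token.text, token.pos_)
```

A production system at this scale would more likely fine-tune transformer models (the BERT family named in the qualifications below) than rely on an off-the-shelf pipeline; the sketch only makes the tasks concrete.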
About The Team

Amazon.com operates in a virtual, global eCommerce environment without boundaries, running a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Services (RBS) organization is a core part of optimizing the customer experience and the selling-partner experience; this team is part of the RBS Customer Experience business unit. The team's primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The compatibility program handled by this team has a direct impact on customer buying decisions and the online user experience: it aims to answer customers' purchase questions about whether two products work together, and to reduce returns due to incompatibility.

Basic Qualifications
Ability to analyse and articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills
Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models
Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking
Strong problem-solving skills, creativity, and ability to overcome challenges
SQL/ETL, automation tools
Relevant bachelor's degree or higher
3+ years of combined relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis); diverse experience will be favored, e.g. a mix of experience across different roles
Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones

Preferred Qualifications
3+ years of combined relevant work experience in related fields (project management, customer advocate, product owner, engineering, business analysis); diverse experience will be favored, e.g. a mix of experience across different roles
Understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services
Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch
Technical expertise and experience in data science and ML

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3031496
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities
‒ Create GraphQL (GQL) APIs by developing in Go.
‒ Assemble large, complex data sets that meet functional and non-functional business requirements.
‒ Build the Go plugins required for ETL processes, for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
‒ Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
‒ Keep our data separated and secure across national boundaries through multiple data centers and regions.
‒ Work with data and analytics experts to strive for greater functionality in our data systems.
‒ Manage exploratory data analysis to support database and dashboard development.

Key Requirements
Experience: 8-10 years would be preferable.

Required Skills
‒ A deep understanding of both fundamental programming concepts and the unique features of Golang.
‒ Well-versed in the Go language, managing pointers, and using its mature standard library.
‒ Proficiency in the Go programming language, understanding of concurrent programming, and familiarity with tools like Docker and Kubernetes.
‒ Strong problem-solving abilities and effective communication skills.
‒ Good knowledge of REST APIs, databases, and SQL.
‒ Understanding of index design and performance-tuning techniques.
‒ Exposure to source control tools like Git and Azure DevOps.
‒ Understanding of Agile methodologies (Scrum, Kanban).
‒ Experience with automated testing and coverage tools, and with CI/CD automation tools (desirable).

Personal Attributes
‒ Very good communication skills.
‒ Ability to easily fit into a distributed development team.
‒ Ability to manage timelines of multiple initiatives.
‒ Ability to articulate insights from the data and help business teams make decisions.
Posted 1 day ago
0 years
0 Lacs
India
On-site
Who We Are Looking For:
We are hiring a hands-on Gen-AI Developer who has strong coding experience in C#, a solid understanding of Generative AI concepts like LLMs (Large Language Models) and AI agents, and is well-versed in using Azure cloud services. This role is not for managers or theorists - we need someone who can build, test, and deploy real-world AI applications and solutions.

Must-Have Skills & Expertise

Programming & Frameworks
C# - Strong command and ability to build enterprise-level applications.
.NET and .NET Aspire - Experience building scalable AI applications using the .NET ecosystem.

AI & Gen-AI Development
Experience working with LLMs such as OpenAI GPT models, Azure OpenAI, or local models.
Hands-on experience with Generative AI tools and frameworks.
Semantic Kernel (Microsoft) - Ability to use and integrate Semantic Kernel for building AI agents and orchestrating tasks.

AI Agent Concepts
Understanding of how AI agents work (multi-step reasoning, task decomposition, autonomous behavior).
Ability to design, build, and optimize agentic AI systems.

Cloud Platform: Microsoft Azure
Should have deployed or worked on AI solutions using the following Azure services: App Services, Containers (Docker on Azure), AI Search, Bot Services, AI Foundry.
Cloud-native development and serverless architectures are a strong plus.

Data Science & Machine Learning
End-to-end ML pipeline development: from data ingestion, model training, and fine-tuning to deployment.
Comfortable working with ML frameworks like MLflow, Kubeflow, or TFX.
Experience with model fine-tuning and deployment, especially of LLMs.

Data & Pipelines
Knowledge of building data pipelines using Apache Airflow, Apache Kafka, or Azure Data Factory.
Experience with both structured (SQL) and unstructured (NoSQL) data.
Familiarity with data lakes, data warehouses, and ETL workflows.

Infrastructure & DevOps
Containerization using Docker and Kubernetes.
Infrastructure-as-code tools like Terraform or Azure Resource Manager (ARM).
CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins.
Building and automating end-to-end pipelines for AI/ML models.

Cloud Security & Cost Management
Cloud security best practices - IAM, VPCs, firewalls, encryption, etc.
Cloud cost optimization - autoscaling, efficient resource allocation, and budget tracking.

Key Responsibilities
Develop, test, and deploy intelligent applications using C# and Gen-AI technologies.
Build and optimize AI agents using Semantic Kernel and LLMs.
Create full ML/AI solutions - from data processing, model training, and evaluation to production deployment.
Integrate and manage Azure AI services in enterprise solutions.
Design and maintain data pipelines, model orchestration workflows, and automated deployments.
Work collaboratively with cross-functional teams (data scientists, DevOps engineers, backend developers).
Ensure performance optimization of deployed models and infrastructure.
Maintain cloud cost efficiency and monitor infrastructure using the right tools and strategies.
Follow Agile methodologies (Scrum/Kanban) and participate in sprint planning, code reviews, and team stand-ups.
Maintain code quality, documentation, and test coverage.

Soft Skills Required
Clear communication skills - you should be able to explain technical ideas to both technical and non-technical stakeholders.
Collaborative mindset - you'll work closely with DevOps, ML Engineers, and Data Scientists.
Strong analytical and problem-solving skills - able to break down complex problems into actionable steps.
Self-motivated and hands-on - you enjoy coding, experimenting, and deploying real systems.
Adaptable to new tools and the fast-changing Gen-AI landscape.

Ideal Candidate Summary
Someone who can code in C#, work with Azure services, and understands AI at a hands-on level. You've built or deployed Gen-AI models, worked with LLMs and AI agents, and can set up the whole stack - from data to deployment - securely and efficiently. You are not afraid to get your hands dirty with containers, pipelines, code, or model tuning.
Posted 1 day ago
5.0 years
0 Lacs
Delhi, India
On-site
A) About the Role
We are seeking a highly skilled and experienced Senior Analyst - Enterprise SaaS to join our team, specializing in the Indian power sector. The ideal candidate will have a robust background in optimization using GAMS (General Algebraic Modelling System), machine learning algorithm development, financial modelling, and energy portfolio management. Additionally, expertise in backend development using Python and R, advanced visualization techniques with Python, Tableau, and Power BI, and database management is required.

B) Detailed expectations from the role
The key responsibilities of this role will include the following:

Optimization & Model Development:
Develop optimization models for the power portfolios of state utilities, based on Resource Adequacy guidelines, using GAMS.
Develop, implement, and optimize machine learning models (LSTM, XGBoost, ARIMA, SARIMA, linear regression, Ridge, Lasso, random forest, etc.) for demand and price forecasting, anomaly detection, and similar problems (a minimal forecasting sketch follows this posting).
Utilize Python, R, TensorFlow, scikit-learn, and other libraries to build robust models.
Collaborate with cross-functional teams to integrate machine learning models into production environments.
Utilize EDA (exploratory data analysis) and ETL (extraction, transformation, and load) tools when developing machine learning models.
Manage and optimize large-scale databases using SQL and NoSQL (MongoDB).
Build and maintain financial models using Python, R, and Advanced Excel to assess and manage energy portfolios, risk assessment, and investment strategies.
Analyse market trends, regulatory impacts, and economic factors influencing the Indian power sector using statistical techniques.
Automate data systems using VBA and macros.
Perform short-, medium-, and long-term load forecasting and price forecasting using statistical and machine learning tools.

Advanced Visualization Techniques:
Develop insightful and interactive visualizations using Python (Matplotlib, Seaborn), Advanced Excel, Tableau, and Power BI to support decision-making and stakeholder communication.
Create and maintain dashboards with meaningful reports that monitor key performance indicators and model outputs, including optimization and energy-management dashboards.

C) Required skill set
Developing models for power portfolio optimization using GAMS.
Expertise in time-series analysis and forecasting techniques using machine learning.
Managing and optimizing databases such as SQL and NoSQL (MongoDB).
Utilizing Python libraries such as Pandas, SciPy, TensorFlow, and scikit-learn to build forecasting models.
Utilizing Python libraries such as Matplotlib and Seaborn to develop visualization insights from data.
Proficiency in Advanced Excel, Power BI, VS Code, and various tools for data analysis.

Preferred Skills:
Understanding of electricity energy trading.
Familiarity with optimization techniques for energy management.
Experience with Git.
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Knowledge of Indian energy sector policies, regulations, and market operations.
Strong communication and collaboration skills.
Client management.

D) Education and Experience
B.Tech / Master's in Electrical or Energy Engineering, Computer Science, Information Technology, or related fields such as Statistics, Mathematics, or Economics, with 5+ years' experience in power sector optimization and forecasting models.
Relevant experience in backend development with specialization in optimization, data science, machine learning, and database management, with a focus on the energy sector.
Proficiency in GAMS, Python, R, and advanced visualization tools (Power BI and Advanced Excel).
Understanding of energy markets and portfolio management.

E) Work Location
The base location shall be New Delhi; however, the role might require the applicant to travel to pursue various opportunities.

F) Remuneration Structure
We offer a motivation-based and competitive reward package.
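As a rough illustration of the SARIMA-style load forecasting named in section B, here is a minimal sketch assuming statsmodels and pandas; the synthetic hourly demand series and the model orders are invented stand-ins for real utility data.

```python
# Minimal SARIMA load-forecasting sketch (statsmodels assumed; data synthetic).
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Illustrative hourly demand with a daily cycle; a real model would be
# trained on the utility's historical load data.
idx = pd.date_range("2024-01-01", periods=24 * 90, freq="h")
demand = pd.Series(
    1000 + 200 * np.sin(2 * np.pi * idx.hour / 24) + np.random.normal(0, 20, len(idx)),
    index=idx,
)

# Seasonal order (1, 0, 1, 24) encodes the daily cycle of hourly data.
model = SARIMAX(demand, order=(1, 1, 1), seasonal_order=(1, 0, 1, 24))
fitted = model.fit(disp=False)

print(fitted.forecast(steps=24))  # next-day hourly load forecast
```

The same train/forecast loop would be benchmarked against the other model families the posting lists (LSTM, XGBoost, Ridge/Lasso) before one is promoted to production.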
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description
Agilisium Consulting specializes in reimagining and co-developing AI-engineered business processes that are autonomous, scalable, and tailored for the Life Sciences industry. Recognized by leading independent analyst firms such as the Everest Group and ISG, and endorsed by consulting leaders like EY and technology giants such as AWS, Agilisium is known for its rigorous execution and continuous innovation. This commitment allows us to shape the future of the life sciences sector effectively.

Role Description
This is a full-time, on-site role for a Data Architect located in Chennai. The Data Architect will be responsible for designing and developing data architecture solutions, leading data modeling efforts, and ensuring effective data governance. Daily tasks include managing data warehousing, overseeing Extract Transform Load (ETL) processes, and collaborating with various teams to optimize data management practices.

Qualifications
Strong skills in data architecture and data modeling
Experience with data governance
Proficiency in Extract Transform Load (ETL) processes
Expertise in data warehousing
Excellent problem-solving and analytical skills
Strong communication and teamwork abilities
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Experience in the Life Sciences industry is a plus
Posted 1 day ago
6.0 - 7.0 years
0 Lacs
Chandigarh
On-site
Job Summary
If you are looking for an opportunity in Technology Solutions and Development, Emerson has this exciting role for you! The candidate will be responsible for supporting the BI tool stack - especially SSIS, SSRS, Azure Synapse Analytics, ETL, SQL Server programming, and Data Marts - and for providing prompt support, maintenance, and development on assigned projects and other responsibilities. This team delivers technology solutions for strategic business needs, drives adoption of these services and support processes, and boosts value by enhancing our customers' experience. You will work alongside a hardworking and dedicated team of self-motivated professionals who share a collective passion for progress and excellence.

In this Role, Your Responsibilities Will Be:
Understand the business need, develop solutions, and support production systems and monitoring.
Apply proficiency in the Microsoft BI tool stack and SQL Server programming.
Perform root cause analysis on production issues, using data to diagnose bottlenecks and inefficiencies.
Refine and automate regular processes, track issues, and document changes.
Gather and analyze requirements for ETL changes, support, and performance improvements.
Configure and maintain database servers and processes, including monitoring daily scheduled jobs and system health and performance, to ensure high levels of performance, availability, and security.
Handle tier-3 support tickets and provide resolution within defined service level agreements.
Write ETL jobs that download different types of data from Oracle to SQL Server Data Marts to support new or existing reports (a minimal sketch follows this posting).
Implement SSIS package configurations (environment variable and SQL Server).
Design, develop, and deploy reports based on requirements using SQL Server Reporting Services (SSRS), including cross-tabbed and matrix reports.
Experience working with Azure services like Azure Data Factory and Azure Data Lake (good to have).
Must have skills in writing complex SQL and PL/SQL programs, with proficiency in performance tuning.
Provide on-time coordination and status reporting to the client.
Good communication skills, with the ability to convey technical concepts to both technical and non-technical customers.
Ability to work independently and within a team environment.
Microsoft certifications in data-related fields are preferred.
Work closely with partners from Business, IT, users, admins, functional managers, and other counterparts.

Who You Are:
You show a tremendous amount of initiative in tough situations and are exceptional at spotting and seizing opportunities. You observe situational and group dynamics and select the best-fit approach. You make implementation plans that allocate resources precisely. You pursue everything with energy, drive, and the need to finish.

For This Role, You Will Need:
The incumbent in this position will work on various business units' MSBI data warehouse support and projects, providing optimum solutions using the Microsoft BI tool stack - especially SSIS, SSAS, SSRS, Azure Synapse Analytics, ETL, SQL Server programming, and Data Marts - along with prompt support, maintenance, and development on assigned projects and other responsibilities.
6-7 years of experience working with the Microsoft BI tool stack and SQL Server programming
6-7 years of experience with SSDT tools, especially SSIS, SSAS, SSRS, and SQL Server
Strong analytical abilities with a proven track record of resolving complex data challenges
Proficiency in database management, SQL query optimization, and data mapping
Understanding of Excel, including formulas, filters, macros, pivots, and related operations
Demonstrable experience leading and managing data-driven projects or teams
Strong proficiency with data visualization tools such as Power BI or similar
Experience in strategic business analysis and collaborating with senior leadership
Good problem-solving and analytical skills
Flexibility to work in a 24x7 environment

Preferred Qualifications that Set You Apart:
Bachelor's degree or equivalent in Science with a technical background (MIS, Computer Science, Engineering, or any related field)
Good interpersonal skills in English, both spoken and written, as you will be working with an overseas team

Our Culture & Commitment to You
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives - because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams working together are key to driving growth and delivering business results.

We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.

WHY EMERSON
Our Commitment to Our People
At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration. We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world's most complex problems - for our customers, our communities, and the planet. You'll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor.

At Emerson, you'll see firsthand that our people are at the center of everything we do. So, let's go. Let's think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let's go, together.

Accessibility Assistance or Accommodation
If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com.

ABOUT EMERSON
Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical, and advanced factory automation operate more sustainably while improving productivity, energy security, and reliability.
With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
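As a rough illustration of the Oracle-to-SQL-Server data-mart load described in this posting's responsibilities, here is a minimal plain-Python sketch assuming the oracledb and pyodbc drivers; in this role the same step would normally be packaged as an SSIS data flow, and all connection strings, table names, and columns below are hypothetical.

```python
# Minimal extract-and-load sketch: Oracle source -> SQL Server data mart.
# Drivers, credentials, and object names are illustrative assumptions.
import oracledb  # assumed Oracle driver
import pyodbc    # assumed SQL Server driver

src = oracledb.connect(user="etl_user", password="***", dsn="ora-host/ORCLPDB1")
dst = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sql-host;DATABASE=DataMart;UID=etl;PWD=***"
)

read_cur, write_cur = src.cursor(), dst.cursor()
read_cur.execute("SELECT order_id, order_date, amount FROM sales.orders")

# Move rows across in batches to keep memory bounded.
while True:
    rows = read_cur.fetchmany(5000)
    if not rows:
        break
    write_cur.executemany(
        "INSERT INTO dbo.FactOrders (OrderId, OrderDate, Amount) VALUES (?, ?, ?)",
        rows,
    )

dst.commit()
src.close()
dst.close()
```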
Posted 1 day ago
7.0 - 9.0 years
5 - 8 Lacs
Gurgaon
On-site
The role aims to leverage data analysis, engineering, and AI/ML techniques to drive strategic business decisions and innovations. This position is responsible for designing and implementing scalable data pipelines, developing innovative models, and managing cloud infrastructure to ensure efficient data processing and storage. The role also involves collaborating with cross-functional teams to translate business needs into technical solutions, mentoring junior team members, and staying abreast of the latest technological advancements. Effective communication, particularly in English, is essential to articulate complex insights and foster a collaborative environment. The ultimate goal is to enhance data-driven decision-making and maintain a competitive edge through continuous improvement and innovation.

Data and AI Specialist, Consulting role

Key Responsibilities:
Data Science: As a Python developer experienced with Azure Cloud and Azure Databricks, create models and algorithms to analyze data and solve business problems (a minimal Databricks-style sketch follows this posting).
Application Architecture: Apply knowledge of enterprise application integration and application design.
Cloud Management: Host and support applications on Azure Cloud.
Data Engineering: Build and maintain systems to process and store data efficiently.
Collaboration: Work with different teams to understand their needs and provide data solutions; share insights through reports and presentations.
Research: Keep up with the latest tech trends and improve existing models and systems.
Mentorship: Guide and support junior team members.

Must have:
Python development in AI/ML and Data Analysis: Strong programming skills in Python or R, plus SQL. Proficiency in statistical analysis and machine learning techniques. Hands-on experience in NLP and NLU. Experience with data visualization and reporting tools (e.g., Power BI). Experience with Microsoft Power Platform (e.g., Power Automate) and SharePoint, including hands-on use of SharePoint for content management.
Data Engineering: Expertise in designing and maintaining data pipelines and ETL processes. Experience with data storage solutions (e.g., Azure SQL). Understanding of data quality and governance principles. Experience with Databricks for big data processing and analytics.
Cloud Management: Proficiency in cloud platforms (e.g., Azure). Knowledge of hosting and supporting applications on Azure Cloud. Knowledge of cloud security and compliance best practices.
Collaboration and Communication: Experience in agile methodologies and project management tools (e.g., Jira). Strong interpersonal and communication skills. Ability to translate complex technical concepts into business terms. Experience working in cross-functional teams. Excellent English communication skills, both written and verbal.
Research and Development: Ability to stay updated with the latest advancements in data science, AI/ML, and cloud technologies. Experience in conducting research and improving model performance.
Mentorship: Experience in guiding and mentoring junior team members. Ability to foster a collaborative and innovative team environment.

Must exhibit the following core behaviors: taking ownership of and accountability for the projects assigned.

Qualifications
Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, IT, or related fields, or an MCA degree
7-9 years of relevant experience
Proficiency in Python, R, cloud platforms (Azure), and data visualization tools like Power BI
Advanced certifications and experience with big data technologies and real-time data processing
Excellent English communication skills

Job Location
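As a rough illustration of the Databricks-style data-engineering work described above, here is a minimal PySpark sketch; the table and column names are hypothetical.

```python
# Minimal PySpark aggregation sketch of a Databricks-style pipeline step.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-insights").getOrCreate()

# Read a curated table, aggregate it, and publish the result for reporting
# (e.g. as the source of a Power BI dataset); names are invented.
orders = spark.read.table("curated.orders")

summary = orders.groupBy("region").agg(
    F.sum("amount").alias("total_amount"),
    F.countDistinct("customer_id").alias("distinct_customers"),
)

summary.write.mode("overwrite").saveAsTable("reporting.region_summary")
```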
Posted 1 day ago
5.0 years
0 Lacs
Haryana
On-site
Senior Data Engineer (C11)
Analytics & Information Management (AIM), Gurugram

Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

We are seeking a highly experienced and strategic Officer - Sr. Data Engineer for the Data/Information Management team. The ideal candidate will be responsible for the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the COO (Chief Operating Office). This role requires a proven track record of implementing optimized data processes and platforms, delivering impactful insights, and fostering a data-driven culture.

The Data/Information Analyst accomplishes results by contributing significantly to the bank's success through data engineering and solution design skills within a specialized domain. Integrates subject matter and industry expertise within a defined area and contributes to standards around which others will operate. Requires an in-depth understanding of how areas collectively integrate within the sub-function, coordination with and contribution to the objectives of the entire function, and basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for the volume, quality, timeliness, and delivery of end results of an area.

Responsibilities:
Primarily support Business Execution activities of the Chief Operating Office and implement data engineering solutions to manage banking operations.
Establish monitoring routines, scorecards, and escalation workflows.
Oversee the data strategy, smart automation, insight generation, data quality, and reporting activities using proven analytical techniques.
Document data requirements and data collection/processing/cleaning, which may include process automation/optimization and data visualization techniques.
Enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies; implement a governance framework with clear stewardship roles and data quality controls.
Interface between business and technology partners for digitizing data collection, including performance generation and validation rules for banking operations.
Build the data strategy by identifying all relevant product processors and creating the data lake, data pipelines, governance, and reporting.
Communicate findings and recommendations to senior management.
Stay current with the latest trends and technologies in analytics.
Ensure compliance with data governance policies and regulatory requirements.
Set up a governance operating framework to enable operationalization of data domains, and identify CDEs and data quality rules, aligned with Citi Data Governance Policies and firmwide Chief Data Office expectations.
Work with large and complex data sets (both internal and external) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipes.
Identify and compile data sets using a variety of tools (e.g. SQL, Access) to help predict, improve, and measure the success of key business-to-business outcomes.
Implement rule-based data quality checks across critical data points; automate alerts for breaks and publish periodic quality reports (a minimal sketch follows this posting).
Develop and execute the analytics strategy - data ingestion, reporting and insights centralization - and ensure consistency, lineage tracking, and audit readiness across legal reporting. (Incumbents in this role may often be referred to as Data Analysts.)
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency, as well as effectively supervising the activity of others and creating accountability with those who fail to maintain these standards.
Work as a senior member of a team of data engineering professionals, delivering on organizational priorities together.

Qualifications:
5+ years of experience in business transformation solution design roles, with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, and SQL.
Strong understanding of data transformation - data strategy, data architecture, data tracing and lineage (the ability to trace data lineage from source systems to the data warehouse to reports and dashboards), scalable data flow design and standardization, platform integration, ETL, and smart automation.
Conceptual, logical, and physical data modeling expertise; proficiency in relational and dimensional data modeling techniques; ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools.
Database management and optimization: expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding.
Strong understanding of data platforms and ecosystems; ability to establish a scalable data management framework covering data provisioning, process optimization, actionable insights, and visualization techniques using Tableau.
Solution architect with a proven ability to translate complex data flows into automated, optimized solutions, and to leverage data analytics tools and techniques for analytics problem solving.
Experience developing and deploying AI solutions in partnership with Tech and Business.
Experience with banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI).
Knowledge of regulatory requirements related to data privacy and security.
Experience interacting with senior stakeholders across the organization to manage the end-to-end conceptualization and implementation of data strategies: standardizing data structures, and identifying and removing redundancies to optimize data feeds.
AI/Gen AI proficiency and thought leadership in financial/business analysis and/or credit/risk analysis, with the ability to impact key business drivers via a disciplined analytic process.
Demonstrated analytics thought leadership and project planning capabilities.
In-depth understanding of the various financial service business models; expert knowledge of advanced statistical techniques and how to apply them to drive substantial business results.
Creative problem-solving skills.

Education: Bachelor's/University degree in STEM; Master's degree preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Level: C11
Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: MicroStrategy, Python (Programming Language), Structured Query Language (SQL), Tableau (Software).

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
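As a rough illustration of the rule-based data quality checks and break alerts described in the responsibilities, here is a minimal pandas sketch; the feed, column names, and rules are hypothetical.

```python
# Minimal rule-based data-quality check sketch (pandas assumed; names invented).
import pandas as pd

# Hypothetical extract of a legal-operations feed.
df = pd.read_parquet("legal_ops_feed.parquet")

# Each rule is a named boolean; a False value is a "break" to be escalated.
rules = {
    "matter_id is never null": df["matter_id"].notna().all(),
    "invoice amounts are non-negative": (df["invoice_amount"] >= 0).all(),
    "matter_id is unique": not df["matter_id"].duplicated().any(),
}

breaks = [name for name, passed in rules.items() if not passed]
if breaks:
    # In practice this would trigger the alerting/escalation workflow
    # and feed the periodic quality report.
    print("Data quality breaks detected:", breaks)
```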
Posted 1 day ago
8.0 - 10.0 years
2 - 7 Lacs
Gurgaon
On-site
Who we are?
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet. We're helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We are committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place. Please visit and follow Johnson Controls LinkedIn for recent exciting activities.

Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

What will you do?
Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices (a minimal ingestion sketch follows this posting).
Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes.
Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives.
Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance.
Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying.
Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows.
Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures.
Training and Support: Provide training and support to teams on Snowflake usage and best practices.
Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity.
Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies.
Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases.

Requirements & Qualifications:
Minimum: Bachelor's / Postgraduate / Master's degree in any stream
Minimum 8-10 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
SQL and Power BI/Tableau are mandatory, along with knowledge of a data integration tool
Excellent communication and collaboration skills
Strong problem-solving abilities and an analytical mindset
Ability to work in a fast-paced, dynamic environment

What we offer:
We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
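As a rough illustration of the Snowflake ingestion work described in this role, here is a minimal sketch assuming the snowflake-connector-python package; the account, stage, and table names are hypothetical.

```python
# Minimal Snowflake bulk-load sketch (connector and object names assumed).
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount",
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# COPY INTO performs the bulk-ingestion step of an ELT pipeline;
# the row count afterwards is a quick sanity check.
cur.execute(
    "COPY INTO raw.orders FROM @ingest_stage/orders/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)
cur.execute("SELECT COUNT(*) FROM raw.orders")
print("rows loaded:", cur.fetchone()[0])

cur.close()
conn.close()
```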
Posted 1 day ago
5.0 years
6 - 7 Lacs
Gurgaon
On-site
Who we are?
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet. We're helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We are committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place. Please visit and follow Johnson Controls LinkedIn for recent exciting activities.

Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

How will you do it?
Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices.
Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes.
Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives.
Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance.
Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying.
Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows.
Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures.
Training and Support: Provide training and support to teams on Snowflake usage and best practices.
Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity.
Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies.
Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases.

What we look for?
Minimum: Bachelor's / Postgraduate / Master's degree in any stream
Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
SQL and Power BI/Tableau are mandatory, along with knowledge of a data integration tool
Excellent communication and collaboration skills
Strong problem-solving abilities and an analytical mindset
Ability to work in a fast-paced, dynamic environment

What we offer:
We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
Posted 1 day ago
8.0 - 10.0 years
5 - 7 Lacs
Gurgaon
On-site
P2-C2-STS JD

Strong SQL skills to perform database queries, data validations, and data integrity checks. Familiarity with relational databases and data management concepts. Working experience with cloud-based data warehouse platforms like Snowflake and AWS. Experience in creating and implementing an ETL testing strategy. Experience in data integrity, data accuracy, and completeness testing. Proficient in source-to-target mapping validation test cases (a minimal sketch follows this posting). Proficient in test planning, test design, test execution, and test management, preferably in the healthcare payor domain.

Lead ETL testing and data migration projects from a QA perspective, ensuring accuracy and completeness. Validate data pipelines in order to maintain data integrity. Perform BI report validation in Power BI for an enterprise-level Sales and Assets dashboard carrying a number of important KPIs, ensuring insights are accurate and actionable. Execute an automation framework for data validation and reconciliation. Interact with business stakeholders and give them UAT support during the UAT cycle. Write complex SQL queries on Snowflake in order to maintain data quality. Maintain test cases in JIRA and Zephyr. Attend all the scrum ceremonies, such as sprint review meetings and daily standups.

Mandatory Skills
8 to 10 years of ETL testing experience
Snowflake and AWS
Business intelligence and data warehouse testing
SQL queries and testing data flow across the data layers
Testing data quality, data integrity, and data reconciliation
Understanding of data warehouses
Working with Agile teams
ETL testing strategy

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth - one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
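As a rough illustration of the source-to-target validation described above, here is a minimal sketch that runs reconciliation SQL against Snowflake from Python, assuming the snowflake-connector-python package; all object names are hypothetical.

```python
# Minimal source-to-target reconciliation sketch for ETL testing
# (Snowflake connector assumed; schema and table names invented).
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount", user="qa_user", password="***",
    warehouse="QA_WH", database="DW",
)
cur = conn.cursor()

# Row counts should agree between the staging and reporting layers.
cur.execute("""
    SELECT (SELECT COUNT(*) FROM staging.claims)   AS src_rows,
           (SELECT COUNT(*) FROM reporting.claims) AS tgt_rows
""")
src_rows, tgt_rows = cur.fetchone()
assert src_rows == tgt_rows, f"count mismatch: {src_rows} vs {tgt_rows}"

# MINUS surfaces rows dropped or altered between the layers.
cur.execute("""
    SELECT claim_id, claim_amount FROM staging.claims
    MINUS
    SELECT claim_id, claim_amount FROM reporting.claims
""")
assert cur.fetchall() == [], "source rows missing or altered in target"
```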
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Good knowledge of GCP, BigQuery, SQL Server, and Postgres DB. Knowledge of Datastream, Cloud Dataflow, Terraform, ETL tools, writing procedures and functions, writing dynamic code, performance tuning and complex queries, and UNIX.
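As a rough illustration of the BigQuery work this role involves, here is a minimal sketch assuming the google-cloud-bigquery client library; the project, dataset, and table are hypothetical.

```python
# Minimal BigQuery query sketch (client library assumed; names invented).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT region, SUM(amount) AS total
    FROM `my_project.sales.orders`
    GROUP BY region
    ORDER BY total DESC
"""

for row in client.query(query).result():
    print(row["region"], row["total"])
```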
Posted 1 day ago
5.0 years
8 - 10 Lacs
Hyderābād
On-site
Overview:
The purpose of the role is to execute Salesforce Marketing Cloud and Data Cloud development and administration activities for multiple consumer-centric projects. The developer is also required to proactively monitor and report on the functioning of various automations and journeys within the DevSecOps framework.

Responsibilities:
Create and manage roles and permissions; create and manage users and accesses.
Create and manage FTP users and directories; handle SFTP key management.
Set up the Sender Authentication Package (SAP); manage send classifications.
Manage the audit trail; manage data import and export in SFMC.
Create reports and send analytics.
Create and monitor file-drop and scheduled automations.
Track the latest release impacts, install packages, and set up new SFMC business units.
Develop and maintain Cloud Pages using SSJS and AMPscript.
Strong knowledge of HTML/CSS and designing emails through Content Builder.
Write and optimize SQL queries for data segmentation and transformation.
Design and monitor file-drop automations and scheduled journeys.
Work with data integration technologies to design and implement new solutions and processes for clients to support new customer experience strategies.
Identify the data points required to enable use cases; analyze data for usability in the CDP; identify data cleansing requirements.
Map data sources to CDP data objects and attributes (which requires good knowledge of the CDP data model); identify the appropriate primary and secondary keys for each object.
Determine the specific unification algorithms required for the data of a given source; design test data to test the data model and unification algorithms; execute complex system tests.

Qualifications:
Bachelor's degree in Engineering with 5+ years of experience in the IT industry
2+ years of hands-on experience with Salesforce Marketing Cloud
Proficiency in AMPscript, SSJS, and SQL within SFMC
1+ year of experience in administrator tasks on one of the Salesforce clouds
1+ year of implementation experience on a Customer Data Platform (Salesforce CDP/Data Cloud preferred)
Good oral and written communication skills, with the ability to collaborate in an agile environment

Preferred:
2+ years of experience with data management, data transformation, and ETL, preferably using cloud-based tools/infrastructure
Aptitude for taking on technical challenges
Salesforce certifications (e.g., Marketing Cloud Consultant, Administrator, Advanced Administrator, Service Cloud Consultant) are a strong advantage
Posted 1 day ago
130.0 years
2 - 7 Lacs
Hyderābād
On-site
Job Description
Manager - Site Reliability Engineer (SRE) - Reliability & Automation

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centres are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Centre helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centres.

Role Overview:
We are looking for a dedicated Site Reliability Engineer (SRE) to ensure the reliability, scalability and operational excellence of our data applications hosted on AWS and in traditional datacentres. You will own release management, automate infrastructure and deployment processes using Python, implement observability solutions and enforce compliance standards to maintain robust and highly available systems.

What will you do in this role:
Reliability Engineering: Design, implement and maintain systems that ensure high availability, fault tolerance and scalability of data applications across cloud (AWS) and on-premises environments.
Release & Deployment Management: Manage and automate release pipelines, coordinate deployments and ensure smooth rollouts with minimal downtime.
DevOps Automation: Develop and maintain automation scripts and tools (primarily in Python) to streamline infrastructure provisioning, configuration management and operational workflows.
Observability & Monitoring: Build and enhance monitoring, logging and alerting systems to proactively detect and resolve system issues, ensuring optimal performance and uptime.
ETL Reliability: Ensure the reliability and scalability of ETL pipelines, including orchestration, scheduling, and dependency management. Automate deployment, rollback, and version control of ETL jobs and workflows. Implement monitoring and alerting for ETL job success, latency, and data quality metrics (a minimal sketch follows this posting). Collaborate with data engineering teams to troubleshoot ETL failures and optimize pipeline performance. Maintain documentation and compliance related to ETL data lineage and processing.
Incident Management & Root Cause Analysis: Lead incident response efforts, perform thorough root cause analysis and implement preventive measures to avoid recurrence.
Compliance & Security: Ensure systems comply with organizational policies and regulatory requirements, including data governance, security best practices and audit readiness. Collaboration: Work closely with development, data engineering and operations teams to align reliability goals with business objectives and support continuous improvement. Documentation & Knowledge Sharing: Maintain clear documentation of infrastructure, processes and incident reports to facilitate team knowledge and operational consistency. Monitoring & Troubleshooting: Implement and maintain monitoring and alerting for database health, query performance, and resource utilization; lead troubleshooting and root cause analysis for database-related incidents. What should you have: Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field. 4+ years of experience in Site Reliability Engineering, DevOps or related roles focused on infrastructure reliability and automation. Strong proficiency in Python for automation and scripting. Experience with ETL orchestration tools such as Apache Airflow, AWS Glue, or similar. Understanding of data pipeline architectures and common failure modes. Familiarity with data quality and lineage concepts in ETL processes. Hands-on experience with AWS cloud services (IAM, S3, Lambda, CloudWatch, Airflow, etc.) and traditional datacenter environments. Expertise in release management and CI/CD pipelines using tools such as Jenkins, GitLab CI, or similar. Deep knowledge of observability tools and frameworks (e.g., Prometheus, Grafana, ELK stack, Datadog). Solid understanding of infrastructure as code (IaC) tools like Terraform, CloudFormation or Ansible. Experience with container orchestration platforms (e.g., Kubernetes, Docker Swarm) is a plus. Strong incident management skills with a focus on root cause analysis and remediation. Familiarity with compliance frameworks and security best practices in cloud and on-prem environments. Excellent communication skills to collaborate effectively across technical and non-technical teams. Preferred Qualifications Advanced degree in a relevant technical field. Certifications such as AWS Certified DevOps Engineer, ITIL V3/4 or similar. Experience working in Agile Scrum or Kanban environments. Knowledge of security and database administration in cloud and hybrid environments. Why Join Us? Play a critical role in ensuring the reliability and scalability of mission-critical data applications. Work with cutting-edge cloud and on-premises technologies. Collaborate with a passionate team focused on operational excellence. Opportunities for professional growth and continuous learning. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Who we are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What we look for Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYD IT 2025 Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Availability Management, Capacity Management, Change Controls, Configuration Management (CM), Design Applications, Incident Management, Information Technology (IT) Infrastructure, IT Service Management (ITSM), Software Configurations, Software Development, Software Development Life Cycle (SDLC), Solution Architecture, System Administration, System Designs Preferred Skills: Job Posting End Date: 09/15/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R359245
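For a sense of the ETL reliability work this role describes, here is a hedged sketch of an Airflow DAG (written against Airflow 2.x) with retries and a failure callback that raises an alert, plus a simple data-quality gate. The DAG name, tasks, and row count are illustrative assumptions, not anything from this posting.

```python
# Hypothetical sketch of the ETL reliability pattern described above:
# an Airflow 2.x DAG with retries and a failure callback that alerts.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def alert_on_failure(context):
    # In practice this might page via SNS or PagerDuty; here we just log.
    ti = context["task_instance"]
    print(f"ALERT: {ti.dag_id}.{ti.task_id} failed on {context['ds']}")

def extract(**_):
    print("pulling source data")  # placeholder for the real extract

def quality_gate(**_):
    row_count = 42  # placeholder; would come from the warehouse
    if row_count == 0:
        raise ValueError("data quality gate failed: empty load")

with DAG(
    dag_id="sales_etl_reliability_demo",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": alert_on_failure,
    },
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="quality_gate", python_callable=quality_gate)
```

The same callback hook is where latency and data-quality metrics would be emitted to an observability stack such as Prometheus or CloudWatch.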
Posted 1 day ago
4.0 years
5 - 7 Lacs
Hyderābād
On-site
About this role: Wells Fargo is seeking a Senior Information Security Engineer. In this role, you will: Lead or participate in computer security incident response activities for moderately complex events Conduct technical investigation of security related incidents and post incident digital forensics to identify causes and recommend future mitigation strategies Provide security consulting on medium projects for internal clients to ensure conformity with corporate information, security policy, and standards Design, document, test, maintain, and provide issue resolution recommendations for moderately complex security solutions related to networking, cryptography, cloud, authentication and directory services, email, internet, applications, and endpoint security Review and correlate security logs Utilize subject matter knowledge in industry-leading security solutions and best practices to implement one or more components of information security such as availability, integrity, confidentiality, risk management, threat identification, modeling, monitoring, incident response, access management, and business continuity Identify security vulnerabilities and issues, perform risk assessments, and evaluate remediation alternatives Collaborate and consult with peers, colleagues and managers to resolve issues and achieve goals Required Qualifications: 4+ years of Information Security Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: 6+ years' experience in development and automation testing methodologies Strong hands-on experience with Playwright for UI automation. Deep understanding of Power BI and Tableau dashboards and architecture. Proficiency in JavaScript/TypeScript, Python, or C#. Solid SQL skills for data validation and backend testing. Experience in test automation frameworks and tools. Familiarity with CI/CD tools like Azure DevOps, GitHub Actions, or Jenkins. Excellent analytical and problem-solving skills. Strong communication and stakeholder management abilities. Ability to work independently and lead testing initiatives. Certifications in Power BI, Tableau, or Test Automation. Experience with cloud platforms (Azure, AWS, GCP). Exposure to Agile/Scrum methodologies. Job Expectations: Automation Strategy & Framework Development: Design and implement scalable automation frameworks using Playwright for dashboard testing. Develop reusable test components and libraries for UI and data validation. Validate visualizations, filters, drilldowns, and data accuracy in Power BI and Tableau dashboards. Automate regression and smoke tests for dashboard releases. Perform data integrity checks between source systems and dashboards. Write SQL queries to validate ETL pipelines and data transformations. Development & Scripting: Develop custom scripts and utilities in JavaScript/TypeScript, Python, or C# to support automation and testing. Collaborate with developers to integrate testing into CI/CD pipelines. Work closely with BI developers, data engineers, and business analysts to understand requirements. Mentor junior team members and contribute to best practices in dashboard automation. Document test cases, automation scripts, and test results. Provide detailed defect reports and collaborate on root cause analysis. Posting End Date: 7 Aug 2025 *Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
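As an illustration of the dashboard-automation expectations above, here is a hedged Playwright (Python) smoke test that loads a BI dashboard and cross-checks a displayed KPI against a source query. The URL, selector, and query are hypothetical placeholders; a real suite would use the warehouse's own driver rather than the sqlite3 stand-in.

```python
# Illustrative sketch only: Playwright smoke test validating a dashboard
# KPI against the source of truth. All names below are placeholders.
import sqlite3  # stand-in for the real warehouse driver

from playwright.sync_api import sync_playwright

DASHBOARD_URL = "https://bi.example.com/dashboards/orders"  # placeholder

def expected_order_count() -> int:
    conn = sqlite3.connect("warehouse.db")  # placeholder source
    (count,) = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
    conn.close()
    return count

def test_orders_kpi():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(DASHBOARD_URL)
        # Wait for the KPI tile to render, then read its text.
        kpi = page.locator("[data-testid='orders-kpi']")  # hypothetical selector
        kpi.wait_for()
        shown = int(kpi.inner_text().replace(",", ""))
        assert shown == expected_order_count(), "dashboard KPI drifted from source"
        browser.close()

if __name__ == "__main__":
    test_orders_kpi()
```

Wired into a CI/CD pipeline (Azure DevOps, GitHub Actions, or Jenkins), this kind of check runs on every dashboard release.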
Posted 1 day ago
5.0 years
0 Lacs
Hyderābād
On-site
DESCRIPTION Amazon’s ROW (Rest of World) Supply Chain Analytics team is looking for talented Business Intelligence Engineers who develop solutions to better manage/optimize speed and operations planning while providing the best experience to our customers at the lowest possible price. Our team members have an opportunity to be at the forefront of supply chain thought leadership by working on some of the most difficult problems with some of the best research scientists, product/program managers, software developers and business leaders in the industry, shaping our roadmap to drive real impact on Amazon's long-term profitability. We are an agile team, building new analysis from the ground up, proposing new concepts and technology to meet business needs, and we enjoy and excel at diving into data to analyze root causes and implement long-term solutions. As a BIE within the group, you will analyze massive data sets, identify areas to improve, define metrics to measure and monitor programs, build models to predict and optimize, and, most importantly, work with different stakeholders to drive improvements over time. You will also work closely with internal business teams to extract or mine information from our existing systems to create new analysis, build analytical products and cause impact across wider teams in intuitive ways. This position provides opportunities to influence high visibility/high impact areas in the organization. They are right a lot, work very efficiently, and routinely deliver results on time. They have a global view of the analytical and/or science solutions that they build and consistently think in terms of automating, expanding, and scaling the results broadly. This position also requires you to work across a variety of teams, including transportation, operations, finance, delivery experience, people experience and platform (software) teams. Successful candidates must thrive in fast-paced environments which encourage collaborative and creative problem solving, be able to measure and estimate risks, constructively critique peer research, extract and manipulate data across various data marts, and align research focuses on Amazon’s strategic needs. We are looking for people with a flair for recognizing trends and patterns while correlating it to the business problem at hand. If you have an uncanny ability to decipher the exact policy/mechanism/solution to address the challenge, and the ability to influence folks using hard data (and some tact), then we are looking for you!
Key job responsibilities Analysis of historical data to identify trends and support decision making, including written and verbal presentation of results and recommendations Collaborating with product and software development teams to implement analytics systems and data structures to support large-scale data analysis and delivery of analytical and machine learning models Mining and manipulating data from database tables, simulation results, and log files Identifying data needs and driving data quality improvement projects Understanding the broad range of Amazon’s data resources, which to use, how, and when Thought leadership on data mining and analysis Modeling complex/abstract problems and discovering insights and developing solutions/products using statistics, data mining, science/machine-learning and visualization techniques Helping to automate processes by developing deep-dive tools, metrics, and dashboards to communicate insights to the business teams Collaborating effectively with internal end-users, cross-functional software development teams, and technical support/sustaining engineering teams to solve problems and implement new solutions About the team The ROW (Rest of World) Supply Chain Analytics team is hiring for multiple BIE roles in speed, planning, inbound and SNOP functions. The role will be responsible for generating insights, defining metrics to measure and monitor, building analytical products, automation and self-serve, and overall driving business improvements. The role involves a combination of data analysis, visualization, statistics, scripting, a bit of machine learning, and usage of AWS services. BASIC QUALIFICATIONS 5+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience with forecasting and statistical analysis Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
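A minimal sketch of the pull-then-analyze loop this role revolves around: SQL for extraction from Redshift, pandas for a quick trend read. The cluster endpoint, credentials, and the ops.shipments table are assumptions for illustration only.

```python
# Hedged sketch: extract with SQL, analyze with pandas. Connection details
# and the shipments table are hypothetical placeholders.
import pandas as pd
import redshift_connector  # AWS's Python driver for Redshift

QUERY = """
SELECT ship_date::date AS day, COUNT(*) AS shipments
FROM ops.shipments                -- hypothetical table
WHERE ship_date >= DATEADD(day, -28, GETDATE())
GROUP BY 1 ORDER BY 1;
"""

conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
    database="analytics", user="bie_user", password="***",
)
df = pd.read_sql(QUERY, conn)
conn.close()

# Simple speed/volume trend: 7-day rolling mean and week-over-week delta.
df["rolling_7d"] = df["shipments"].rolling(7).mean()
wow = df["shipments"].tail(7).sum() / df["shipments"].iloc[-14:-7].sum() - 1
print(f"week-over-week shipment change: {wow:.1%}")
```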
Posted 1 day ago
3.0 years
4 - 8 Lacs
Hyderābād
On-site
Job title : Analyst - Data & Process Management Location : Hyderabad/Mumbai % of travel expected : Travel required as per business need, if any Job type : Permanent and Full time About the job Our Team: Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India and is set up to centralize processes and activities to support Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi, globally. Main responsibilities: The overall purpose and main responsibilities are listed below: Ensure the accuracy, security, and accessibility of organizational data. Identify and resolve data inconsistencies, redundancies, and quality issues. Maintain requirements documents, business rules and metadata. Collaborate across departments to streamline data processes, implement governance frameworks, and provide insights that drive informed decision-making. People: Maintain effective relationships with the end stakeholders within the allocated GBU and tasks – with an end objective to develop education and communication content as per requirement. Actively lead and develop SBO operations associates and ensure new technologies are leveraged. Initiate the contracting process and related documents within defined timelines; and collaborate with global stakeholders for project planning, setting up the timelines and maintaining budget. Performance: Refresh existing reports and identify improvement opportunities in reporting and BI tools, updating structure/functionality as needed with the latest insights as they become available. Create dashboards to synthesize and visualize key information and enable business decisions. Work to develop deal-tracking analytics and reporting capabilities. Collaborate with Digital to enhance data access across various sources and to develop tools, technology, and processes to constantly improve quality and productivity. Collect, organize, and maintain datasets to ensure accuracy, completeness, and consistency. Monitor data pipelines and ETL processes, and ensure the smooth flow of data across systems. Develop and enforce data quality standards, governance policies, and best practices. Analyse data to identify trends, patterns, and insights that support decision-making. Build and maintain dashboards and reports using BI tools (e.g., Tableau, Power BI). Provide ad hoc data analysis for various departments and stakeholders. Performance indicators: Adherence to timelines and quality targets. Process: Support delivery of projects in terms of resourcing, coordination, quality, timeliness, efficiency, and high technical standards for deliveries made by the medical/field writing group, including scientific documents and clinical/medical reports. Contribute to overall quality enhancement by ensuring high scientific standards for the output produced by the medical/field writing group. Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards. Refresh reports on a frequency/cycle basis (weekly/monthly/quarterly/annually), along with QC checks for each refresh. Ability to work cross-functionally, gather requirements, analyse data, and generate insights and reports that can be used by the GBU.
Stakeholder: Work closely with global teams and/or external vendors to ensure the end-to-end effective project delivery of the designated publication/medical education deliverables, and to prioritize work and deliver on time-sensitive requests. Be able to provide and defend gathered intelligence, methodology, content, and conclusions to the global leadership in a clear, concise format. About you Experience: 3+ years of experience in pharma data management, data governance and data stewardship. In-depth knowledge of common databases like IQVIA, APLD, SFMC, Google Analytics, engagement and execution data, etc. Soft skills: Strong learning agility; ability to manage ambiguous environments, and to adapt to changing needs of the business; good interpersonal and communication skills; strong presentation skills a must; team player who is curious, dynamic, result oriented and can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment, with ability to work across time zones; demonstrated leadership and management in driving innovation and automation leveraging advanced statistical and analytical techniques. Technical skills: Bachelor’s degree in Life Sciences, Pharmacy, Data Science, Computer Science, or a related field. A master’s degree is preferred. 3–5 years of hands-on experience in pharmaceutical data and data management, with a focus on syndicated data, Specialty Pharmacy, and digital/multi-channel data. Strong technical expertise in tools and platforms such as AWS, Snowflake, Databricks, SQL, Python, and Informatica. Solid knowledge of pharmaceutical sales and marketing data sources like IQVIA and Veeva. Familiarity with pharmaceutical sales operations and the application of data within a pharmaceutical commercial operations context. Ability to translate business requirements into detailed data solutions. Familiarity with data governance and stewardship practices, ensuring data quality and compliance. Experience with analytical tools such as Power BI, VBA, and Alteryx is a plus. Ability to contribute to driving innovation and automation by leveraging advanced analytical and statistical techniques. Education: Advanced degree in areas such as Management/Statistics/Decision Sciences/Engineering/Life Sciences/Business Analytics or related field (e.g., PhD / MBA / Masters). Languages: Excellent knowledge of English and strong communication skills – written and spoken. Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks’ gender-neutral parental leave. Play an instrumental part in creating best practice within our Go-to-Market Capabilities. Pursue progress, discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.
At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
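To ground the data-quality responsibilities above, here is a hedged pandas sketch of the completeness, redundancy, and consistency checks such a role typically automates. The file and column names are hypothetical, standing in for an engagement feed such as an SFMC or IQVIA extract.

```python
# Hedged illustration: completeness, redundancy, and consistency checks
# on a hypothetical engagement dataset. Column names are placeholders.
import pandas as pd

df = pd.read_csv("engagement_data.csv")  # placeholder extract

checks = {
    # Completeness: key identifier must not be null.
    "hcp_id_complete": df["hcp_id"].notna().mean(),
    # Redundancy: no duplicate (hcp_id, activity_date, channel) rows.
    "duplicate_rows": int(df.duplicated(["hcp_id", "activity_date", "channel"]).sum()),
    # Consistency: channel values limited to a governed vocabulary.
    "bad_channels": int((~df["channel"].isin(["email", "call", "web"])).sum()),
}

for name, value in checks.items():
    print(f"{name}: {value}")

# A scheduled refresh would typically fail fast rather than publish bad data.
assert checks["duplicate_rows"] == 0, "QC failed: duplicate engagement rows"
```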
Posted 1 day ago
4.0 years
0 Lacs
Hyderābād
On-site
Overview: We are seeking a Platform Architect with expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to design, implement, and optimize enterprise-level data integration platforms. The ideal candidate will have a strong background in ETL/ELT architecture, cloud data integration, and platform modernization, ensuring scalability, security, and performance across on-prem and cloud environments. Responsibilities: Platform Engineering & Administration Oversee installation, configuration, and optimization of PowerCenter and IICS environments. Manage platform scalability, performance tuning, and troubleshooting. Implement data governance, security, and compliance (e.g., role-based access, encryption, data lineage tracking). Optimize connectivity and integrations with various sources (databases, APIs, cloud storage, SaaS apps). Cloud & Modernization Initiatives Architect and implement IICS-based data pipelines for real-time and batch processing. Migrate existing PowerCenter workflows to IICS, leveraging serverless and cloud-native features. Ensure seamless integration with cloud platforms (AWS, Azure, GCP) and modern data lakes/warehouses (Snowflake, Redshift, BigQuery). Qualifications: 4 years of experience in data integration and ETL/ELT architecture. Expert-level knowledge of Informatica PowerCenter and IICS (Cloud Data Integration, API & Application Integration, Data Quality). Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data platforms (Snowflake, Databricks, Redshift, BigQuery). Strong SQL, database tuning, and performance optimization skills. Deep understanding of data governance, security, and compliance best practices. Experience in automation, DevOps (CI/CD), and Infrastructure-as-Code (IaC) tools for data platforms. Excellent communication, leadership, and stakeholder management skills. Preferred Qualifications Informatica certifications (IICS, PowerCenter, Data Governance). Proficiency in PowerCenter to IDMC conversions. Understanding of real-time streaming (Kafka, Spark Streaming). Knowledge of API-based integration and event-driven architectures. Familiarity with Machine Learning and AI-driven data processing.
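A PowerCenter-to-IICS migration usually starts with an inventory of what exists. Since PowerCenter objects export to XML, one heavily hedged way to scope the work is a script like the following; the POWERMART/REPOSITORY/FOLDER/MAPPING tag structure reflects the usual export layout, but verify it against an actual export from your repository before relying on it.

```python
# Heavily hedged sketch: count mappings per folder in a PowerCenter XML
# export to scope an IICS conversion. Tag and attribute names are
# assumptions to validate against a real export.
import xml.etree.ElementTree as ET
from collections import Counter

tree = ET.parse("powercenter_export.xml")  # placeholder export file
root = tree.getroot()                      # expected root: <POWERMART>

inventory = Counter()
for repo in root.iter("REPOSITORY"):
    for folder in repo.iter("FOLDER"):
        inventory[folder.get("NAME")] = len(list(folder.iter("MAPPING")))

for folder_name, count in inventory.most_common():
    print(f"{folder_name}: {count} mappings to assess for IICS conversion")
```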
Posted 1 day ago
1.0 years
4 - 6 Lacs
Hyderābād
On-site
DESCRIPTION Amazon is a place where data drives most of our decision-making. The Analytics, Operations & Programs (AOP) team is looking for a dynamic data engineer who is innovative, a strong problem solver, and able to lead the implementation of the analytical data infrastructure that will guide decision-making. As a Data Engineer, you think like an entrepreneur, constantly innovating and driving positive change, but more importantly, you consistently deliver mind-boggling results. You're a leader who uses both quantitative and qualitative methods to get things done. And on top of it all, you're someone who wonders "What if?" and then seeks out the solution. This position offers exceptional opportunities to grow your technical and non-technical skills. You have the opportunity to really make a difference to our business by inventing, enhancing and building world-class systems, delivering results, working on exciting and challenging projects. As a Data Engineer, you are responsible for analyzing large amounts of business data, solving real-world problems, and developing metrics and business cases that will enable us to continually delight our customers worldwide. This is done by leveraging data from various platforms such as Jira, Portal, Salesforce. You will work with a team of Product Managers, Software Engineers and Business Intelligence Engineers to automate and scale the analysis, and to make the data more actionable to manage business at scale. You will own many large datasets and implement new data pipelines that feed into or from critical data systems at Amazon. You must be able to prioritize and work well in an environment with competing demands. Successful candidates will bring strong technical abilities combined with a passion for delivering results for customers, internal and external. This role requires a high degree of ownership and a drive to solve some of the most challenging data and analytic problems in retail. Candidates must have demonstrated ability to manage large-scale data modeling projects, identify requirements and tools, and build data warehousing solutions that are explainable and scalable. In addition to the technical skills, a successful candidate will possess strong written and verbal communication skills and a high intellectual curiosity with ability to learn new concepts/frameworks and technology rapidly as changes arise. Key job responsibilities Design, implement and support an analytical data infrastructure Managing AWS resources including EC2, EMR, S3, Glue, Redshift, etc. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency Collaborate with Data Scientists and Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers Maintain internal reporting platforms/tools including troubleshooting and development. Interact with internal users to establish and clarify requirements in order to develop report specifications. Work with Engineering partners to help shape and implement the development of BI infrastructure including Data Warehousing, reporting and analytics platforms. Contribute to the development of the BI tools, skills, culture and impact.
Write advanced SQL queries and Python code to develop solutions. A day in the life This role requires you to live at the intersection of data, software, and analytics. We leverage a comprehensive suite of AWS technologies, with key tools including S3, Redshift, DynamoDB, Lambda, APIs, and Glue. You will drive the development process from design to release. Managing data ingestion from heterogeneous data sources, with automated data quality checks. Creating scalable data models for effective data processing, storage, retrieval, and archiving. Using scripting for automation and tool development, which is scalable, reusable, and maintainable. Providing infrastructure for self-serve analytics and science use cases. Using industry best practices in building CI/CD pipelines. About the team The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, analysts and scientists who champion customer obsession. We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams. BASIC QUALIFICATIONS 1+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) PREFERRED QUALIFICATIONS Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like, Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
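For a concrete flavor of the AWS automation this role describes, here is a minimal boto3 sketch that kicks off a Glue ETL job and polls its state. The job name and region are hypothetical placeholders.

```python
# Hedged sketch: trigger and watch a Glue ETL job run via boto3.
# The job name and region are placeholders, not real resources.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="jira_ingest_etl")  # hypothetical job name
run_id = run["JobRunId"]

while True:
    job_run = glue.get_job_run(JobName="jira_ingest_etl", RunId=run_id)
    state = job_run["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

if state != "SUCCEEDED":
    raise RuntimeError(f"ETL run {run_id} ended in state {state}")
```

Wrapped in a Lambda or an Airflow task, this becomes one link in the automated, quality-checked ingestion the posting calls for.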
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview PepsiCo Data BI & Integration Platforms is seeking a mid-level Cloud Platform technology leader, responsible for overseeing the deployment and maintenance of big data and analytics cloud infrastructure projects on Azure/AWS for its North America PepsiCo Foods/Beverages business. The ideal candidate will have hands-on experience with Azure/AWS services: Infrastructure as Code (IaC), platform provisioning & administration, cloud network design, cloud security principles and automation. Responsibilities Implement and support application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies. Spearhead automation of routine tasks across cloud infrastructure using tools like Terraform, PowerShell, or Python, minimizing manual intervention and improving consistency. Champion toil reduction strategies, implementing self-service workflows and auto-remediation to boost system resilience and free up resource capacity. Drive toil reduction initiatives by leveraging infrastructure-as-code, scripting, and self-healing solutions to enhance system reliability and agility. Implement cloud infrastructure policies, standards, and best practices, ensuring cloud environment adherence to security and regulatory requirements. Design, deploy and optimize cloud-based infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services. Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence & enterprise application teams, and PepsiCo premium cloud partners (Microsoft Azure, AWS). Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors. Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources. Write and maintain scripts for automation and deployment using Terraform, Python, or the Azure/AWS CLI. Work with stakeholders to document architectures, configurations, and best practices. Knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory, threat detection and prevention, disaster recovery and business continuity. Qualifications Bachelor’s degree in Computer Science. At least 8 to 10 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 6 years in a technical leadership role Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps. Deep expertise in Azure/AWS big data & analytics technologies, including Databricks, real time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, monitoring and security tools. Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets. Proven experience managing cloud infrastructure (Azure, AWS, or GCP) with strong proficiency in automation tools like Terraform, Ansible, or PowerShell. Solid understanding of DevOps practices, CI/CD pipelines, and infrastructure-as-code to streamline operations and reduce manual effort.
Ability to design and implement toil-reduction strategies, including self-healing systems, observability, and proactive alerting mechanisms. Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible. Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences. Certifications in Azure/AWS platform administration, networking and security are preferred. Strong self-organization, time management and prioritization skills. A high level of attention to detail, excellent follow-through, and reliability. Strong collaboration, teamwork and relationship building skills across multiple levels and functions in the organization. Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams. Strategic thinker focused on business value results that utilize technical solutions. Strong communication skills in writing, speaking, and presenting. Able to work effectively in a multi-tasking environment. Fluent in English.
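As an illustration of the self-healing, toil-reduction pattern named above, here is a hedged Python Lambda handler that reacts to a CloudWatch alarm delivered via SNS and reboots the affected instance. The alarm payload fields and the InstanceId dimension are assumptions to validate against your own alarm configuration.

```python
# Hedged auto-remediation sketch: reboot an EC2 instance when its
# CloudWatch alarm fires. Payload shape assumes a standard alarm-to-SNS
# notification; verify against your own alarms before use.
import json
import boto3

ec2 = boto3.client("ec2")

def handler(event, context):
    for record in event["Records"]:            # SNS fan-out from the alarm
        alarm = json.loads(record["Sns"]["Message"])
        if alarm.get("NewStateValue") != "ALARM":
            continue
        dims = alarm["Trigger"]["Dimensions"]  # assumed to carry InstanceId
        instance_ids = [d["value"] for d in dims if d["name"] == "InstanceId"]
        if instance_ids:
            ec2.reboot_instances(InstanceIds=instance_ids)
            print(f"auto-remediation: rebooted {instance_ids}")
```

The equivalent resources (alarm, SNS topic, Lambda, IAM role) would normally be provisioned through Terraform or another IaC tool rather than by hand.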
Posted 1 day ago
5.0 years
2 - 3 Lacs
Hyderābād
On-site
Job Title: Data Engineer Experience Level: 5+ Years Location: Hyderabad Job Summary We are looking for a seasoned and innovative Senior Data Engineer to join our dynamic data team. This role is ideal for professionals with a strong foundation in data engineering, coupled with hands-on experience in machine learning workflows, statistical analysis, and big data technologies. You will play a critical role in building scalable data pipelines, enabling advanced analytics, and supporting data science initiatives. Proficiency in Python is essential, and experience with PySpark is a strong plus. Key Responsibilities Data Pipeline Development: Design and implement scalable, high-performance ETL/ELT pipelines using Python and PySpark. ML & Statistical Integration: Collaborate with data scientists to integrate machine learning models and statistical analysis into data workflows. Data Modeling: Create and optimize data models (relational, dimensional, and columnar) to support analytics and ML use cases. Big Data Infrastructure: Manage and optimize data platforms such as Snowflake, Redshift, BigQuery, and Databricks. Performance Tuning: Monitor and enhance the performance of data pipelines and queries. Data Governance: Ensure data quality, integrity, and compliance through robust governance practices. Cross-functional Collaboration: Partner with analysts, scientists, and product teams to translate business needs into technical solutions. Automation & Monitoring: Automate data workflows and implement monitoring and alerting systems. Mentorship: Guide junior engineers and promote best practices in data engineering and ML integration. Innovation: Stay current with emerging technologies in data engineering, ML, and analytics. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 5+ years of experience in data engineering with a strong focus on Python and big data tools. Solid understanding of machine learning concepts and statistical analysis techniques. Proficiency in SQL and Python; experience with PySpark is highly desirable. Experience with cloud platforms (AWS, Azure, or GCP) and data tools (e.g., Glue, Data Factory, Dataflow). Familiarity with data warehousing and lakehouse architectures. Knowledge of data modeling techniques (e.g., star schema, snowflake schema). Experience with version control systems like Git. Strong problem-solving skills and ability to work in a fast-paced environment. Excellent communication and collaboration skills. Your future duties and responsibilities Required qualifications to be successful in this role Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team—one of the largest IT and business consulting services firms in the world.
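For a compact picture of the ETL/ELT pipeline work this role centers on, here is an assumption-laden PySpark sketch: read raw files, apply ML-friendly transforms with a basic quality rule, and write a curated, partitioned table. Paths and column names are placeholders.

```python
# Hedged sketch of a curate step in a PySpark ETL pipeline.
# Paths, columns, and rules below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/raw/orders/")  # placeholder path

curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)          # basic data-quality rule
)

# A feature useful downstream for ML: day-of-week of the order.
curated = curated.withColumn("order_dow", F.dayofweek("order_ts"))

curated.write.mode("overwrite").partitionBy("order_dow").parquet("/mnt/curated/orders/")
```

On Databricks the same job would typically be scheduled through Azure Data Factory or a workflow, with monitoring and alerting wrapped around it.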
Posted 1 day ago
3.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Lilly, we unite caring with discovery to make life better for people around the world. We are a global healthcare leader headquartered in Indianapolis, Indiana. Our 35,000 employees around the world work to discover and bring life-changing medicines to those who need them, improve the understanding and management of disease, and give back to our communities through philanthropy and volunteerism. We give our best effort to our work, and we put people first. We’re looking for people who are determined to make life better for people around the world. Come bring to life technologies to lead in Pharma-tech! The Enterprise Data organization has developed an integrated and intuitive data and analytics platform. This platform enables Lilly team members to quickly ingest, transform, consume, and analyze data sets in statistical environments, advanced analytics environments, and BI/visualization tools. Contributors can easily ingest, prepare, and analyze new data sets (cleanse, enhance, publish) for others to utilize. What You Will Be Doing Reporting to the Manager, LCCI Tech@Lilly, in this role you will work closely with data engineers, business analysts, quality, data owners and stakeholders to efficiently manage, monitor and optimize the ongoing flow of quality data to consumers for data sharing and analytics. Key Responsibilities Monitor Data Pipelines: Ensure the timely and accurate flow of data through pipelines. Incident Management: Detect, troubleshoot, and resolve issues in data pipelines to maintain data quality and integrity. End user communication: Inform downstream teams of incidents or anomalies in data quality, availability, or performance, and the expected resolution. Root Cause Analysis: Review incidents and problems to learn and improve future processes. Performance Optimization: Continuously optimize data pipeline performance, ensuring timely availability of data. Cloud FinOps: Continuously monitor data pipeline costs, identifying and implementing cost-saving opportunities without compromising performance. Data Quality Assurance: Implement measures to ensure data accuracy, consistency, and reliability. Lifecycle management: Assess, execute, and test any necessary product upgrades to enabling services. Cyber: Apply any required patches or changes for identified security vulnerabilities. Configuration Changes: Execute configuration changes for configurable core components. Automation: Develop and implement automation for monitoring and incident management processes. Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and improve pipeline performance. Documentation: Maintain comprehensive documentation of data operations processes, monitoring procedures, and issue resolution protocols. Security and Compliance: Ensure data security and compliance with relevant processes and standard operating procedures.
Validation: Execute periodic reviews to ensure the system remains secure and in a validated state. Consult and Advise: Advise on the use of data products. Strong decision-making capabilities and the ability to drive initiatives with clarity and purpose. Qualifications / Skills: Bachelor's Degree or equivalent in Information Technology or related field. 3-8 years of work experience, including Information Technology experience in multiple technical areas and roles. Willingness to work in rotational shifts. Strong analytical skills to troubleshoot and resolve issues quickly and efficiently. Strong collaboration skills to work effectively with cross-functional teams including data engineers, business analysts, data scientists and business stakeholders. Strong communication skills to articulate technical concepts to non-technical stakeholders and document processes. Flexibility to adapt to new technologies and methodologies as the data and technical landscape evolves. Mastery of ETL processes and tools, and of SQL. Minimum of 3 years of hands-on experience with AWS services and security (S3, RDS, Lambda, Glue, EC2, Redshift, CloudWatch, CloudTrail, IAM). Experience with CI/CD, GitHub Actions and Apache Airflow. ITIL Foundation certified, or experience with incident, problem, event and change management best practices. AWS Foundations Certified and/or AWS Certified DevOps Engineer. Experience with agile frameworks (like Kanban, SAFe, etc.) and solid understanding of associated practices and tools. A high level of intellectual curiosity, external perspective, strong learning agility and innovation interest. Additional Skills/Preferences Experience working in an AWS Data Lakehouse architecture. Deep understanding of privacy regulations relating to information technology and security. Position located in Hyderabad, India. Lilly is dedicated to helping individuals with disabilities to actively engage in the workforce, ensuring equal opportunities when vying for positions. If you require accommodation to submit a resume for a position at Lilly, please complete the accommodation request form (https://careers.lilly.com/us/en/workplace-accommodation) for further assistance. Please note this is for individuals to request an accommodation as part of the application process and any other correspondence will not receive a response. Lilly does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status. #WeAreLilly
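To make the pipeline-monitoring duty concrete, here is a hedged sketch that publishes a custom data-freshness metric to CloudWatch, against which an alarm can be defined. The namespace, metric, and dataset names are illustrative assumptions.

```python
# Hedged sketch: publish a pipeline-freshness metric to CloudWatch.
# Namespace, metric, and dataset names are hypothetical.
from datetime import datetime, timezone
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def publish_freshness(dataset: str, last_load: datetime) -> None:
    lag_minutes = (datetime.now(timezone.utc) - last_load).total_seconds() / 60
    cloudwatch.put_metric_data(
        Namespace="DataOps/Pipelines",          # hypothetical namespace
        MetricData=[{
            "MetricName": "FreshnessLagMinutes",
            "Dimensions": [{"Name": "Dataset", "Value": dataset}],
            "Value": lag_minutes,
            "Unit": "None",
        }],
    )

publish_freshness("clinical_ingest", datetime(2025, 1, 1, tzinfo=timezone.utc))
```

An Airflow sensor or scheduled Lambda would call this after each load, so a CloudWatch alarm catches stale data before downstream consumers do.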
Posted 1 day ago
12.0 years
7 - 8 Lacs
Hyderābād
On-site
Job title Lead – Business Analytics Location Hyderabad % of travel expected Travel required as per business need Job type Permanent and Full time About the job Go-To-Market Capabilities (GTMC) Hub is an internal Sanofi resource organization based in Hyderabad, India and is set up to centralize processes and activities to support Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. GTMC strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi globally. At Sanofi, we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics and insights to drive decision-making that will allow us to tackle some of the world’s greatest health threats. Within our Insights & Analytics organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple functional areas such as finance, manufacturing, product development and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across the franchises, fosters innovation and best practices, and creates solutions to bring speed, scale and shareability to our planning processes. We are looking for a Lead to support our analytics and reporting team. Robust analytics and reporting are priorities for our businesses, so it is essential to have someone who understands and aspires to implement innovative analytics techniques to drive insights generation across the GBUs. Key Responsibilities This role will be responsible for creating synergies and providing functional and operational direction to multiple processes across the various GBU operations and therapy areas, so that cost savings are achieved and deliveries are optimized through multi-layered teams.
Act as a strategic thought partner to Business Analytics operations across GBUs. Develop and maintain expertise on key trends and developments in the industry. Provide strategic inputs to the deliverables and ensure delivery as per plan, with accuracy. Identify areas for innovation and implement them. Lead teams of business analytics professionals. Partner, as a quantitative strategy consultant, with a broad range of internal cross-BU client teams across regions to deliver and advise on seamless solutions to the most specific marketing and sales issues facing the stakeholders. Evaluate the effectiveness of various promotional and marketing programs leveraging secondary data sources, reporting platforms/ETL setups, and the impact of channel dynamics: field force/multi-channel modelling (new commercial model). Coach and develop the team; mentor the team on day-to-day as well as exceptional cases/situations. Monitor progress of work and provide solutions to issues and queries. Plan resources and back-ups for business continuity. Share best practices and serve as a change agent and facilitator for operational excellence. Support delivery of projects in terms of resourcing, quality, timeliness, efficiency, and high standards for deliverables made by the teams. Secure adherence to compliance procedures and internal/operational risk controls in accordance with any and all applicable regulatory standards. Lead and implement operational excellence projects within the team in alignment with overall direction from senior leadership, using appropriate process improvement tools. Ensure creation and development of tools, technology and processes in order to constantly improve quality and productivity. Maintain effective relationships with the end stakeholders with an end objective of client delight. Technical skills Expert in relational database technologies and concepts. Capable of working on multiple projects simultaneously. Hands-on experience using analytical tools like Power BI, SQL, Snowflake, and advanced Excel (including VBA); Python is good to have. Experience of developing, refreshing and managing dashboards. Experience with pharmaceutical datasets (e.g. IQVIA, Symphony, Komodo, Veeva, Salesforce) preferable. Understanding of pharmaceutical development, manufacturing, supply chain, sales and marketing functions is preferable. Experience 12+ years of relevant work experience, with a solid understanding of principles, standards, and best practices of Pharma Commercial Analytics. Education Bachelor's or Master's degree in areas such as Information Science/Operations/Management/Statistics/Decision Sciences/Engineering/Life Sciences/Business Analytics or a related field. Soft skills Strong leadership skills, learning agility, and the ability to manage ambiguous environments and to adapt to changing needs of the business. Good interpersonal and negotiation skills. Strong presentation skills. Team player who is curious, dynamic and result oriented. Ability to operate effectively in an international matrix environment, with ability to work across time zones. Demonstrated leadership and management in driving innovation and automation leveraging advanced analytical techniques. Ability to deal with ambiguity and conflicting priorities. Highly proficient in stakeholder management, project management and people management. Languages Excellent English communication skills – written and spoken. Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team.
Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks’ gender-neutral parental leave. Play an instrumental part in creating best practice within our Go-to-Market Capabilities. Pursue progress, discover extraordinary! Better is out there. Better medications, better outcomes, better science. But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people! At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
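To illustrate the channel-effectiveness analysis named in the responsibilities, here is a hedged Snowflake sketch that computes a response rate by promotional channel. The account, schema, and promo_activity table are placeholders only.

```python
# Illustrative only: pull promotional-response data from Snowflake for a
# channel-effectiveness read. Connection details and the table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",   # placeholder
    user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="COMMERCIAL", schema="PROMO",
)

QUERY = """
SELECT channel,
       SUM(responses) / NULLIF(SUM(touches), 0) AS response_rate
FROM promo_activity            -- hypothetical table
WHERE activity_month >= DATEADD(month, -6, CURRENT_DATE)
GROUP BY channel
ORDER BY response_rate DESC;
"""

for channel, rate in conn.cursor().execute(QUERY):
    print(f"{channel}: {float(rate):.2%}")
conn.close()
```

A fuller promo-response or field-force model would layer regression or marketing-mix techniques on top of an extract like this.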
Posted 1 day ago
10.0 years
1 - 4 Lacs
Chennai
On-site
Job Family: Data Science & Analysis (India) Travel Required: Clearance Required: What You Will Do: Lead and execute data engineering projects from inception to completion, ensuring timely delivery and high quality. Build and optimize data architectures for operational and analytical purposes. Collaborate with cross-functional teams to gather and define data requirements. Implement data quality, data governance, and data security practices. Manage and optimize cloud-based data platforms (Azure, AWS). Develop and maintain Python/PySpark libraries for data ingestion, processing and integration with both internal and external data sources. Design and optimize scalable data pipelines using Azure Data Factory and Spark (Databricks). Work with stakeholders, including the Executive, Product, Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Develop frameworks for data ingestion, transformation, and validation. Mentor junior data engineers and guide best practices in data engineering. Evaluate and integrate new technologies and tools to improve data infrastructure. Ensure compliance with data privacy regulations (HIPAA, etc.). Monitor performance and troubleshoot issues across the data ecosystem. What You Will Need: Bachelor's or master’s degree in Computer Science, Information Systems, Statistics, Math, Engineering, or a related discipline. Minimum 10+ years of solid hands-on experience in data engineering and cloud services. Experience in leading and mentoring team members. Good experience in Azure Data Factory (ADF), Databricks, Python and PySpark. Good experience in modern data storage concepts such as data lake and lakehouse. Experience in other cloud services (AWS) and data processing technologies will be an added advantage. Ability to enhance and develop ETL processes and resolve defects in them using cloud services. Experience handling large volumes (multiple terabytes) of incoming data from clients and 3rd party sources in various formats such as text, CSV, EDI X12 files, and Access databases. Experience with one or more programming languages such as Java or .Net in an application development environment is highly preferred. Experience with software development methodologies (Agile, Waterfall) and version control tools. Highly motivated, strong problem solver, self-starter, and fast learner with demonstrated analytic and quantitative skills. Good communication skills. What Would Be Nice To Have: Experience in different cloud providers. Experience in programming. Experience in DevOps. What We Offer: Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace. About Guidehouse Guidehouse is an Equal Opportunity Employer–Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco. If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com.
All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or guidehouse@myworkday.com. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse’s Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant’s dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
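As an illustration of the ADF/Databricks pipeline work this posting describes, here is a minimal PySpark sketch of an ingestion-and-validation job. It is a hypothetical example, not Guidehouse code; the storage path, table name, and column names are assumptions.

```python
# Hypothetical PySpark ingestion sketch for a Databricks-style lakehouse.
# Paths, table names, and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_ingest").getOrCreate()

# Ingest a large CSV drop from a client landing zone (ADLS path is hypothetical).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://landing@exampleaccount.dfs.core.windows.net/claims/*.csv")
)

# Basic data-quality gate: require a primary key, deduplicate, stamp ingest date.
clean = (
    raw.dropna(subset=["claim_id"])
       .dropDuplicates(["claim_id"])
       .withColumn("ingest_date", F.current_date())
)

# Land the curated data as a Delta table (Delta is the default format on Databricks).
clean.write.format("delta").mode("append").saveAsTable("curated.claims")
```

In an ADF-orchestrated setup, a job like this would typically run as a Databricks notebook or job activity triggered on file arrival in the landing zone.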
Posted 1 day ago
5.0 years
0 Lacs
Chennai
On-site
DESCRIPTION About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.
About Team: The RBS team is an integral part of Amazon's online product lifecycle and buying operations. The team is designed to ensure Amazon remains competitive in the online retail space with the best prices, wide selection, and good product information. The team's primary role is to create and enhance retail selection in the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and the online user experience.
Overview of the role: The ideal candidate will be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. You will be detail-oriented and organized, capable of handling multiple projects at once and dealing with ambiguity and rapidly changing priorities. You will have expertise in process optimization and systems thinking, and will be required to engage directly with multiple internal teams to drive business projects and automation for the RBS team. Candidates must be successful both as individual contributors and in a team environment, and must be customer-centric. Our environment is fast-paced and requires someone who is flexible, detail-oriented, and comfortable working in a deadline-driven environment.
Responsibilities Include: Works across teams and the Ops organization at the country, regional, and/or cross-regional level to drive improvements and enable the implementation of solutions for customers, delivering cost savings in process workflows, systems configuration, and performance metrics.
Basic Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proficiency in automation using Python. Excellent oral and written communication skills. Experience with SQL, ETL processes, or data transformation.
Preferred Qualifications: Experience with scripting and automation tools. Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK. Knowledge of AWS services such as SQS, SNS, CloudWatch, and DynamoDB. Understanding of DevOps practices, including CI/CD pipelines and monitoring solutions. Understanding of cloud services, serverless architecture, and systems integration.
Key job responsibilities: As a Business Intelligence Engineer on the team, you will collaborate closely with business partners to architect, design, and implement BI projects and automations. Responsibilities: Design, develop, and operate scalable, performant data warehouse (Redshift) tables, data pipelines, reports, and dashboards. Develop moderately to highly complex data processing jobs using appropriate technologies (e.g., SQL, Python, Spark, AWS Lambda). Develop dashboards and reports. Collaborate with stakeholders to understand business domains, requirements, and expectations.
Additionally, work with owners of data source systems to understand their capabilities and limitations. Deliver minimally to moderately complex data analyses, collaborating as needed with Data Science as complexity increases. Actively manage project timelines and deliverables, anticipate risks, and resolve issues. Adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
Internal job description: Retail Business Service, ARTS is a growing team that supports the Retail Efficiency and Paid Services business and tech teams. There is ample growth opportunity in this role for someone who exhibits the Ownership and Insist on the Highest Standards leadership principles and has strong experience with engineering and operational best practices.
Basic qualifications: 5+ years of relevant professional experience in business intelligence, analytics, statistics, data engineering, data science, or a related field. Experience with data modeling, SQL, ETL, data warehousing, and data lakes. Strong experience with engineering and operations best practices (version control, data quality/testing, monitoring, etc.). Expert-level SQL. Proficiency with one or more general-purpose programming languages (e.g., Python, Java, Scala). Knowledge of AWS products such as Redshift, QuickSight, and Lambda. Excellent verbal/written communication and data presentation skills, including the ability to succinctly summarize key findings and effectively communicate with both business and technical teams.
Preferred qualifications: Experience with data-specific programming languages/packages such as R or Python pandas. Experience with AWS solutions such as EC2, DynamoDB, S3, and EMR. Knowledge of machine learning techniques and concepts.
BASIC QUALIFICATIONS 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, QuickSight, or similar tools. Experience with data modeling, warehousing, and building ETL pipelines. Experience with statistical analysis packages such as R, SAS, and MATLAB. Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling.
PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., using databases in a business environment with large-scale, complex datasets.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
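As a rough sketch of the SQL-plus-Python workflow these qualifications describe, the following hypothetical example pulls a metric from Redshift (which is PostgreSQL-compatible at the wire level, so psycopg2 works) and post-processes it with pandas. The cluster endpoint, schema, and column names are assumptions, not Amazon internals.

```python
# Hypothetical BI workflow sketch: pull from Redshift with SQL, process in pandas.
# Endpoint, credentials, and table/column names are illustrative assumptions.
import os

import pandas as pd
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="bi_user",
    password=os.environ["REDSHIFT_PASSWORD"],  # never hard-code credentials
)

# Daily order counts for the trailing 30 days (schema/table are hypothetical).
query = """
    SELECT order_date, COUNT(*) AS orders
    FROM retail.orders
    WHERE order_date >= CURRENT_DATE - 30
    GROUP BY order_date
    ORDER BY order_date;
"""
df = pd.read_sql(query, conn)
conn.close()

# Scripting step: flag days that deviate sharply from the period mean --
# the kind of check that might back a QuickSight dashboard or an alert.
df["z_score"] = (df["orders"] - df["orders"].mean()) / df["orders"].std()
print(df.loc[df["z_score"].abs() > 2])
```

The same pull could just as plausibly be scheduled as an AWS Lambda or pipeline job, with the aggregate written back to a Redshift reporting table for dashboarding.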
Posted 1 day ago