23,992 ETL Jobs - Page 2

JobPe aggregates listings for easy access; you apply directly on the original job portal.

2.0 years

3 - 10 Lacs

Chennai

On-site

DESCRIPTION

About Amazon.com: Amazon.com strives to be Earth's most customer-centric company, where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from website to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world.

About Team: The RBS team is an integral part of Amazon's online product lifecycle and buying operations. The team ensures Amazon remains competitive in the online retail space with the best prices, wide selection, and good product information. Its primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The tasks handled by this group have a direct impact on customer buying decisions and the online user experience.

Overview of the role: The ideal candidate is a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. You will be detail-oriented and organized, capable of handling multiple projects at once, and comfortable with ambiguity and rapidly changing priorities. You will have expertise in process optimization and systems thinking, and will engage directly with multiple internal teams to drive business projects and automation for the RBS team. Candidates must succeed both as individual contributors and in a team environment, and must be customer-centric. Our environment is fast-paced and requires someone who is flexible, detail-oriented, and comfortable working in a deadline-driven environment.

Responsibilities include: Working across teams and the Ops organization at country, regional, and/or cross-regional level to create automated solutions for customers and deliver cost savings through process automation, systems configuration, and performance metrics. Applying logical reasoning, critical thinking, and problem-solving to automation scripting, with framework-engineering ability and adherence to automation development best practices. Automating user interactions and APIs with existing tools and solutions, building localized small-scale solutions for quick deployment, and writing Python scripts to automate day-to-day, repeatable activities within a team. Optionally, building a front-end UI for a web application. Prioritizing projects and feature sets, and evaluating and setting stakeholder expectations for Amazon's marketplace at country, regional, and/or cross-regional level. Writing clear and detailed functional specifications based on business requirements, and writing and reviewing business cases. Applying a rigorous approach to problem solving. Acting as a credible business partner to Amazon's Operations network, with relevant understanding of and experience in automation processes and workflows, and the ability to dive deep into an automation process to correct under-performing parts and act as a troubleshooter.
Basic Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proficiency in automation using Python. Excellent oral and written communication skills. Experience with SQL, ETL processes, or data transformation.

Preferred Qualifications: Experience with scripting and automation tools. Familiarity with Infrastructure as Code (IaC) tools such as AWS CDK. Knowledge of AWS services such as SQS, SNS, CloudWatch, and DynamoDB. Understanding of DevOps practices, including CI/CD pipelines and monitoring solutions. Understanding of cloud services, serverless architecture, and systems integration.

Key job responsibilities: As an Automation Expert, you are responsible for working with cross-functional teams to develop small-to-medium-scale, long-term automated solutions using APIs, Selenium, Python, and other tools, and for using automation metrics to identify improvement opportunities. Working in a dynamic environment, you will be responsible for monitoring key success metrics. You will be expected to quickly become a subject matter expert in automation and help business leaders improve automation penetration, make better decisions, and generate value. In this role, you will work closely with your peers and operations managers to understand potential business automation use cases and convert them into automated solutions.

BASIC QUALIFICATIONS: Experience defining requirements and using data and metrics to draw business insights. Experience with SQL or ETL. 2+ years of experience in tax, finance, or a related analytical field.

PREFERRED QUALIFICATIONS: Knowledge of Python, VBA, macros, and Selenium scripts.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
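The posting centers on scripting repeatable UI and API workflows in Python with Selenium. The sketch below illustrates that pattern under stated assumptions: the endpoint URL, page selectors, and field names are hypothetical placeholders, not real Amazon systems.

```python
# Hypothetical sketch: pull a work queue from an internal API, then replay a
# repetitive form-submission workflow in the browser. URLs, element IDs, and
# field names are illustrative assumptions only.
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By


def fetch_pending_items(api_url: str) -> list:
    """Fetch the day's pending work items from a (hypothetical) internal API."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    return response.json()["items"]


def submit_item(driver, item: dict) -> None:
    """Automate one repetitive UI interaction for a single work item."""
    driver.get("https://internal-tool.example.com/form")  # placeholder URL
    driver.find_element(By.ID, "sku").send_keys(item["sku"])
    driver.find_element(By.ID, "submit").click()


if __name__ == "__main__":
    driver = webdriver.Chrome()
    try:
        for item in fetch_pending_items("https://internal-api.example.com/queue"):
            submit_item(driver, item)
    finally:
        driver.quit()
```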

Posted 9 hours ago

Apply

2.0 years

6 - 9 Lacs

Chennai

On-site

DESCRIPTION

Are you passionate about transforming complex data into actionable business insights at a global scale? RBS Brand Experience (formerly APIE) is seeking an experienced Business Intelligence Engineer who thrives on ambiguity and can decipher evolving business needs to shape data-driven solutions. As a Business Intelligence Engineer, you'll be at the intersection of data and business strategy, translating complex requirements into actionable analytics solutions. You'll partner with stakeholders to unlock insights that elevate our global work authorization experiences and drive program scalability.

Key job responsibilities: A successful candidate will demonstrate: Advanced SQL skills for writing complex queries and stored procedures to extract, transform, and analyze large datasets. Proficiency in Python, particularly with libraries like pandas and PySpark, for data manipulation and ETL processes. Strong analytical and problem-solving capabilities, with the ability to translate business requirements into efficient data solutions. Experience designing and implementing scalable ETL pipelines that can handle large volumes of data. Expertise in data modeling and database optimization techniques to improve query performance. Ability to work with various data sources and formats, integrating them into cohesive data structures. Skill in developing and maintaining data warehouses and data lakes. Proficiency in using BI tools to create insightful visualizations and dashboards. Ability to thrive in ambiguous situations, identifying data needs and proactively proposing solutions. Excellence in communicating technical concepts and data insights to both technical and non-technical audiences. A customer-centric mindset with a focus on delivering data solutions that drive business value.

A day in the life: You'll work closely with Product Managers, Software Developers, and business stakeholders to: build and maintain dashboards that drive business decisions; perform deep-dive analyses to uncover actionable insights; develop and automate data processes to improve efficiency; present findings and recommendations to leadership; and partner with global teams to implement data-driven solutions.

BASIC QUALIFICATIONS: 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, QuickSight, or similar tools. Experience with a scripting language (e.g., Python, Java, or R). Experience building and maintaining basic data artifacts (e.g., ETL, data models, queries). Experience applying basic statistical methods (e.g., regression) to difficult business problems. Experience gathering business requirements and using industry-standard business intelligence tools to extract data, formulate metrics, and build reports.

PREFERRED QUALIFICATIONS: Bachelor's degree or advanced technical degree. Knowledge of data modeling and data pipeline design. Experience with statistical analysis and correlation analysis. Experience designing and implementing custom reporting systems using automation tools.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
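The role pairs SQL with pandas for ETL work. Below is a minimal pandas sketch of that extract-transform-load pattern; the file paths and column names are illustrative assumptions, not a real dataset.

```python
# Minimal ETL sketch in pandas: extract a raw export, transform it, and load a
# columnar file for BI tools. Paths and column names are hypothetical.
import pandas as pd


def run_etl(source_csv: str, target_parquet: str) -> pd.DataFrame:
    # Extract: read raw records from the source system export.
    raw = pd.read_csv(source_csv, parse_dates=["created_at"])

    # Transform: de-duplicate on the business key and derive a reporting week.
    cleaned = raw.drop_duplicates(subset=["case_id"]).assign(
        week=lambda df: df["created_at"].dt.to_period("W").astype(str)
    )

    # Load: write a Parquet file the dashboard layer can query efficiently.
    cleaned.to_parquet(target_parquet, index=False)
    return cleaned


# Usage (hypothetical files): run_etl("cases.csv", "cases.parquet")
```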

Posted 9 hours ago

Apply

5.0 years

2 - 3 Lacs

Chennai

On-site

This is a Data Engineer position - a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is defining optimal solutions for data collection, processing, and warehousing. The candidate must have Spark Java development expertise in big data processing, along with Python and Apache Spark skills, particularly within the banking & finance domain, and will design, code, and test data systems and implement them into the internal infrastructure.

Responsibilities: Ensuring high-quality software development, with complete documentation and traceability. Developing and optimizing scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data. Designing and implementing distributed computing solutions for risk modeling, pricing, and regulatory compliance. Ensuring efficient data storage and retrieval using big data technologies. Implementing best practices for Spark performance tuning, including partitioning, caching, and memory management. Maintaining high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins). Working on batch processing frameworks for market risk analytics. Promoting unit/functional testing and code inspection processes. Working with business stakeholders and Business Analysts to understand requirements. Working with other data scientists to understand and interpret complex datasets.

Qualifications: 5-8 years of experience working in data ecosystems. 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks. 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase. Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL. Data integration, migration, and large-scale ETL experience (common ETL platforms such as PySpark, DataStage, Ab Initio, etc.) - ETL design and build, handling, reconciliation, and normalization. Data modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning). Experience working with large and multiple datasets and data warehouses. Experience building and optimizing 'big data' pipelines, architectures, and datasets. Strong analytic skills and experience working with unstructured datasets. Ability to effectively use complex analytical, interpretive, and problem-solving techniques. Experience with Confluent Kafka, Red Hat jBPM, and CI/CD build pipelines and toolchain - Git, Bitbucket, Jira. Experience with external cloud platforms such as OpenShift, AWS, and GCP. Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos). Experience integrating search solutions with middleware and distributed messaging (Kafka). Highly effective interpersonal and communication skills with technical and non-technical stakeholders. Experience in the software development life cycle and good problem-solving skills.
Excellent problem-solving skills and a strong mathematical and analytical mindset. Ability to work in a fast-paced financial environment.

Education: Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain. Job Family Group: Technology. Job Family: Data Architecture. Time Type: Full time. Most Relevant Skills: Please see the requirements listed above. Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
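The responsibilities single out partitioning, caching, and memory management as the core Spark tuning levers. The sketch below illustrates each one; it is written in PySpark for brevity (the role itself is Spark Java), and the paths, table names, and partition counts are hypothetical.

```python
# Spark tuning sketch: size shuffle partitions, repartition on the join key,
# and cache a reused dataset with a spill-to-disk storage level. All paths,
# columns, and numbers are illustrative assumptions.
from pyspark import StorageLevel
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("risk-batch")
    .config("spark.sql.shuffle.partitions", "400")  # match cluster parallelism
    .getOrCreate()
)

trades = spark.read.parquet("/data/trades")  # hypothetical input path

# Partitioning: co-locate rows that share an aggregation key.
trades_by_book = trades.repartition(400, "book_id")

# Caching: several downstream aggregations reuse this dataset; MEMORY_AND_DISK
# spills to disk instead of failing when executor memory runs short.
trades_by_book.persist(StorageLevel.MEMORY_AND_DISK)

exposure = trades_by_book.groupBy("book_id").sum("notional")
exposure.write.mode("overwrite").parquet("/data/exposure")

trades_by_book.unpersist()  # release memory once the reuse is over
```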

Posted 9 hours ago

Apply

0 years

0 Lacs

Chennai

On-site

Job Profile Summary: Responsible for leading the development and optimization of data platforms and analytics systems, ensuring alignment with business requirements and best practices, while leading data security, training, and project management efforts for successful implementation and user empowerment.

1. Lead the development and maintenance of data platforms, the data product factory, and analytics systems to support data-driven decision-making.
2. Design and optimize data warehouse architecture to support efficient storage and retrieval of large datasets.
3. Enable self-service data exploration capabilities for users to analyze and visualize data independently.
4. Develop reporting and analysis applications to generate insights from data for business stakeholders.
5. Design and implement data models to organize and structure data for analytical purposes.
6. Implement data security and federation strategies to ensure the confidentiality and integrity of sensitive information.
7. Optimize business intelligence production processes and adopt best practices to enhance efficiency and reliability.
8. Drive and maintain relationships with vendors and oversee project management activities to ensure timely and successful implementation of data platforms and the data product factory.

Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred. Relevant work experience in data engineering: Lead I - five (5) years; Lead II - six (6) years.

Knowledge, Skills and Abilities: Fluency in English. Analytical skills. Accuracy and attention to detail. Numerical skills. Planning and organizing skills. Presentation skills. Data modeling and database design. ETL (Extract, Transform, Load) skills. Programming skills.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company: FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy: The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle: we return these profits back into the business and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture: Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.

Posted 9 hours ago

Apply

0 years

5 - 7 Lacs

Chennai

On-site

10-15 years of experience in Databricks and exposure to Data/AI platforms, with expertise in PySpark and Data Factory. Develop efficient Extract, Load, and Transform (ELT/ETL) processes to facilitate seamless data integration, transformation, and loading from various sources into the data platform using Azure and Databricks; this includes inbound and outbound data processes. Conduct and support unit and system testing, SIT, and UAT. Support platform deployment and post-go-live support.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth - one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
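The inbound flow described - land raw source files, then transform into curated tables on Databricks - is often structured as bronze/silver layers. Here is a minimal PySpark sketch of that pattern; the storage paths, container names, and columns are hypothetical placeholders.

```python
# Hypothetical two-step ELT on Databricks: append raw files to a bronze Delta
# table, then cleanse into a silver table. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("inbound-elt").getOrCreate()

# Load: ingest raw source files as-is into the bronze layer.
raw = spark.read.option("header", True).csv(
    "abfss://landing@exampleaccount.dfs.core.windows.net/orders/"
)
raw.write.format("delta").mode("append").save("/mnt/bronze/orders")

# Transform: de-duplicate and conform types into the silver layer consumers use.
silver = (
    spark.read.format("delta").load("/mnt/bronze/orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")
```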

Posted 9 hours ago

Apply

5.0 years

1 - 7 Lacs

Chennai

On-site

Job Description - ODI Developer. Location: Equitas Office, Backside Vikatan Office, 757, Vasan Ave, Anna Salai, Thousand Lights, Chennai, Tamil Nadu 600002. Job Type: Full-Time. Experience: 5+ years.

Job Summary: We are hiring a Lead Data Engineer to architect and lead enterprise data integration initiatives. This role requires deep technical expertise in data engineering and leadership experience. Familiarity with Oracle Data Integrator (ODI) is preferred, especially in environments using the Oracle stack.

Key Responsibilities: Architect and oversee the implementation of scalable, reliable data pipelines. Define standards and best practices for data integration and ETL development. Lead a team of data engineers and mentor junior staff. Collaborate with stakeholders to understand business data needs and translate them into technical solutions. Ensure adherence to data governance, security, and compliance requirements.

Requirements: 5+ years of experience in data engineering, including team leadership roles. Deep knowledge of ETL architecture and data integration frameworks. Experience with any ETL tool (ODI is mandatory). Strong SQL, data modeling, and performance tuning skills. Experience with cloud data platforms and modern data architectures. Excellent leadership, communication, and stakeholder management skills. Knowledge of real-time or near-real-time data streaming (e.g., Kafka).

Job Type: Full-time. Pay: ₹12,817.62 - ₹60,073.88 per month. Benefits: Health insurance, Provident Fund. Experience: 5 years (Preferred). Location: Chennai, Tamil Nadu (Required). Work Location: In person.

Posted 9 hours ago

Apply

0 years

2 - 8 Lacs

Chennai

On-site

Date live: 08/01/2025. Business Area: Finance. Area of Expertise: Finance. Contract: Permanent. Reference Code: JR-0000063715.

Join us as an Assistant VP at Barclays, where you will be involved in functional design, data, end-to-end process and controls, delivery, and functional testing. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful in this role, you should have: Experience developing dashboards in SAP Analytics Cloud and Tableau; primary experience in SAP Analytics Cloud and SAP-related toolsets is preferred. The ability to develop process workflows and manage ETL tools like SAP BW, Alteryx, etc. The ability to provide design solutions for internal reporting problem statements and business requirements, delivering quickly using tactical solutions while connecting to the strategic roadmap. The ability to act as a business analyst supporting the function, thinking strategically and delivering MI views that enable analytics and support quick decision-making. The ability to support the business on an agile basis in delivering requirements, which is critical in a DevOps model. The ability to build innovative dashboards on a sprint basis, with a key focus on controls and governance structure. The ability to visually enhance an analytical view from a legacy Excel/PPT model. Adherence to all IR controls, developing and implementing robust control mechanisms in all processes managed.

Some other highly valued skills may include: Knowledge of business intelligence platforms, primarily SAP Analytics Cloud, and the ability to work with data management tools. Project management/scrum master capabilities to drive prioritization. Experience designing MI dashboards and insights. Broad business and industry knowledge and experience.

You may be assessed on the key critical skills relevant for success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role will be based out of Chennai.

Purpose of the role: To develop business capabilities for Finance through key stages of functional design, data, end-to-end process and controls, delivery, and functional testing.

Accountabilities: Functional Design: leveraging best-practice concepts, and in collaboration with Line SMEs, support options analysis and recommendations as part of decision making. Data Analysis/Modelling/Governance: design the conceptual data model underpinning all phases of the processes, and governance requirements in accordance with GDMS standards and principles. End-to-End Process & Controls: development of target process and controls design/documentation and operational runbooks, and aligning these components with organizational and role/service model design definitions. Delivery/Implementation Support: update design/functional requirements throughout the development cycle, and resolve RAIDs related to functional requirements and business processes; project management for change programmes that have limited technology investment. Functional Testing: develop scripts and data to test alignment to requirement definitions, ahead of user testing cycles.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, an individual contributor will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk, and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external (such as procedures and practices in other areas, teams, and companies), to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge and Drive - the operating manual for how we behave.

Posted 9 hours ago

Apply

1.0 years

4 - 6 Lacs

Chennai

On-site

DESCRIPTION

Want to participate in building the next generation of an online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in the MENA region, with operations spanning 8 countries, and offers online payment services to thousands of merchants. The APS team is building robust payment solutions to drive the best payment experience on and off Amazon. Over 100 million customers send tens of billions of dollars through our systems annually. We build systems that process payments at an unprecedented scale with accuracy, speed, and mission-critical availability. We innovate to improve the customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements, and gift cards. Many new exciting and challenging ideas are in the works.

Key job responsibilities: Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will collaborate closely with both technical and non-technical teams to design and execute roadmaps.

BASIC QUALIFICATIONS: 1+ years of data engineering experience. Experience with SQL. Experience with data modeling, warehousing, and building ETL pipelines. Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala). Experience with one or more scripting languages (e.g., Python, KornShell).

PREFERRED QUALIFICATIONS: Experience with big data technologies such as Hadoop, Hive, Spark, and EMR. Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
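Warehouse ETL of the kind listed here often reduces to incremental loads from staging into fact tables. The Spark SQL sketch below shows one such pattern; the table and column names are invented for illustration.

```python
# Hypothetical incremental load: insert only staging rows whose payment_id is
# not already in the fact table. Table and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("payments-etl").getOrCreate()

spark.sql("""
    INSERT INTO fact_payments
    SELECT s.payment_id,
           s.merchant_id,
           s.amount,
           CAST(s.created_at AS DATE) AS payment_date
    FROM staging_payments s
    LEFT ANTI JOIN fact_payments f   -- keep only rows not yet loaded
        ON s.payment_id = f.payment_id
""")
```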

Posted 9 hours ago

Apply

5.0 years

10 Lacs

Noida

On-site

We at Beesolver Technologies are looking for an experienced Senior Laravel/PHP Developer with a strong foundation in backend development, MySQL, and e-commerce systems. The ideal candidate will be responsible for designing scalable applications, integrating data pipelines, and driving feature delivery in a collaborative Agile team setup.

Job Title: PHP Laravel Developer - Ecommerce & Data Integration. Job Type: Full-Time. Experience: 5+ Years. Location: Noida/Chd.-Mohali. Work Timing: 10 AM-7 PM. Industry: IT Services / E-commerce / SaaS. Functional Area: Software Development.

Roles and Responsibilities: Develop secure and scalable backend applications using Laravel (PHP). Design, optimize, and manage MySQL schemas and performance tuning. Build and maintain ETL/ESB pipelines for data synchronization across systems. Work with Laravel queues, events, jobs, and the scheduler. Develop REST APIs and manage third-party integrations (shipping, CRM, payments). Collaborate with cross-functional Agile teams: developers, testers, and product owners. Implement and follow best practices in code quality, testing, and CI/CD.

Technical Skills: 5+ years of PHP development experience (3+ years in Laravel). Strong experience with MySQL (joins, indexes, stored procedures, tuning). ETL/ESB knowledge using Laravel Jobs, Talend, or Apache NiFi. Skilled in REST API design, integration, and OAuth/webhooks.

Ecommerce Domain Knowledge: Hands-on experience with major ecommerce platforms. Strong understanding of ecommerce business processes, including: product catalogs, variants, and SKU management; order lifecycle management (cart, checkout, order placement, payment, fulfilment, return/refund); inventory management, stock sync, and warehouse integration; shipping and logistics API integrations; and ERP and CRM system integration for unified data flow. Knowledge of ecommerce KPIs and data reporting (RFM, CLTV, conversion rate).

Preferred Skills (Good to Have): Experience with RabbitMQ, Kafka, or any messaging system. Exposure to Talend, Apache NiFi, or Pentaho. Familiarity with DDD and clean/hexagonal architecture patterns. Basic experience with cloud platforms: AWS, Azure, or GCP.

Education: UG: B.Tech/B.E. in Computer Science, IT, or BCA. PG: MCA, M.Tech (preferred).

Job Types: Full-time, Permanent. Pay: Up to ₹89,785.33 per month. Benefits: Paid time off, Provident Fund. Schedule: Day shift. Supplemental Pay: Yearly bonus. Work Location: In person.

Posted 9 hours ago

Apply

5.0 years

1 - 10 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a talented and motivated Data Engineer to join our growing data team. You will play a key role in building scalable data pipelines, optimizing data infrastructure, and enabling data-driven solutions.

Primary Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing. Build and optimize data models and data warehouses to support analytics and reporting. Collaborate with analysts and software engineers to deliver high-quality data solutions. Ensure data quality, integrity, and security across all systems. Monitor and troubleshoot data pipelines and infrastructure for performance and reliability. Contribute to internal tools and frameworks to improve data engineering workflows. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: 5+ years of experience working on commercially available software and/or healthcare platforms as a Data Engineer. 3+ years of solid experience designing and building enterprise data solutions on the cloud. 1+ years of experience developing solutions hosted within public cloud providers such as Azure or AWS, or private cloud/container-based systems using Kubernetes/OpenShift. Experience with modern relational databases. Experience with data warehousing services, preferably Snowflake. Experience using modern software engineering and product development tools, including Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps, etc. Solid experience operating in a quickly changing environment and driving technological innovation to meet business requirements. Skilled at optimizing SQL statements. Subject matter expertise in cloud technologies (preferably Azure) and the big data ecosystem.

Preferred Qualifications: Experience with real-time data streaming and event-driven architectures. Experience building big data solutions on public cloud (Azure). Experience building data pipelines on Azure with Databricks Spark, Scala, Azure Data Factory, Kafka and Kafka Streams, App Services, and Azure Functions. Experience developing RESTful services in .NET, Java, or any other language. Experience with DevOps in data engineering. Experience with microservices architecture. Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform, Docker). Knowledge of data governance and data lineage tools. Ability to establish repeatable processes and best practices, and to implement version control software in a cloud team environment.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location, and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 9 hours ago

Apply

3.0 years

4 - 6 Lacs

Noida

On-site

Date: Aug 1, 2025. Location: Noida, IN, 201301.

Customer Support - BI & Reporting Analyst

Job Summary: We are looking for a skilled and analytical BI & Reporting Analyst to support our Customer Support team through advanced reporting, visualization, and data insights. This role is critical in translating ITIL-aligned support data into actionable dashboards and reports to drive performance, SLA adherence, and continuous improvement.

Key Responsibilities: Design, build, and maintain interactive dashboards in Power BI that visualize KPIs across Incident, Problem, Change, and Request Management processes. Collaborate with ITSM and Customer Support teams to identify reporting needs aligned with ITIL practices and service goals. Translate raw data from ITSM tools (e.g., Jira Service Management, ServiceNow, BMC Remedy) into clean, structured datasets suitable for reporting. Provide insights into support performance, SLA compliance, ticket volumes, resolution times, backlog trends, and user satisfaction metrics. Develop data models, queries, and metrics that support operational and strategic decision-making. Ensure accuracy, consistency, and availability of real-time and historical data for dashboards and reports. Document and maintain data definitions, report logic, and dashboard usage guidelines. Support audits, compliance tracking, and executive reporting with on-demand and scheduled data visualizations. Continuously identify opportunities to automate reporting and improve data accessibility and storytelling.

Required Qualifications: 3+ years of hands-on experience designing Power BI dashboards and reports, preferably in an IT or customer-support-focused organization. Strong knowledge of ITIL frameworks and ITSM processes (especially Incident, Problem, and Change Management). Experience working with ITSM platforms such as Jira Service Management, ServiceNow, or BMC Remedy. Understanding of support operations, service metrics (SLAs, KPIs), and reporting requirements in customer support or service desk environments. Strong analytical thinking and attention to detail. Familiarity with Excel, SQL, and other data tools.

Preferred Qualifications: ITIL Foundation Certification (v3 or v4). Experience with automated data pipelines or ETL tools. Experience integrating data from multiple systems (CRM, ITSM, HR systems, etc.). Familiarity with tools like Tableau or Excel VBA as secondary platforms.

Our Culture & Values: At Ingenico, we thrive on innovation, collaboration, and delivering customer value. Our values - Trust, Innovation, and Care - define how we work and grow together. We challenge the status quo, push boundaries, and deliver results as a team.

Diversity & Inclusion: Ingenico is proud to be an equal opportunity employer. We are committed to fostering an inclusive environment where every employee feels respected and empowered.

Ready to Make an Impact? Join us and help shape the future of payments across Asia. Apply now. Learn more about Ingenico: Ingenico Global Website: https://www.ingenico.com Ingenico LinkedIn: https://www.linkedin.com/company/ingenico/
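Before SLA and resolution-time KPIs reach a Power BI dashboard, the raw ticket export is usually shaped into a clean summary dataset. A minimal pandas sketch of that step is below; the column names and the 8-hour SLA target are assumptions, not Ingenico's actual definitions.

```python
# Hypothetical shaping step: compute ticket volume, average resolution time,
# and SLA compliance per priority from a raw ITSM ticket export. Column names
# and the SLA threshold are illustrative assumptions.
import pandas as pd


def sla_summary(tickets: pd.DataFrame, sla_hours: float = 8.0) -> pd.DataFrame:
    tickets = tickets.copy()
    tickets["resolution_hours"] = (
        tickets["resolved_at"] - tickets["created_at"]
    ).dt.total_seconds() / 3600
    tickets["within_sla"] = tickets["resolution_hours"] <= sla_hours

    # One row per priority, ready to feed a dashboard dataset.
    return tickets.groupby("priority").agg(
        ticket_volume=("ticket_id", "count"),
        avg_resolution_hours=("resolution_hours", "mean"),
        sla_compliance=("within_sla", "mean"),
    )
```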

Posted 9 hours ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Noida

On-site

The Position: We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will play a pivotal part in designing and implementing custom solutions that support complex financial and IP calculations, reporting, and data transformations. Your work will directly contribute to improving our clients' operational efficiency and decision-making capabilities.

What you will do: Problem-Solving: develop innovative solutions to complex challenges in financial calculations, rights management, and process optimization. Data Engineering Solutions: design, build, and maintain scalable data pipelines for migration, cleansing, transformation, and integration tasks, ensuring high-quality data outcomes. Database Development & Maintenance: configure, implement, and refine stored procedures and queries to ensure optimal performance, scalability, and maintainability of database systems. ETL & Data Migration: develop robust ETL (Extract, Transform, Load) processes that integrate data from diverse sources, ensuring seamless migration and transformation for downstream analytics and reporting. Automation & Scripting: create and implement automated scripts and tools to streamline routine database tasks, reduce manual intervention, and improve overall operational efficiency. Collaboration: partner with cross-functional teams to align data engineering efforts with broader business objectives and deliver seamless solutions that drive value across the organization. IP Commerce Data Expertise: leverage deep knowledge of financial and rights data to develop creative solutions that address client needs and advance business goals. Process Improvement: continuously identify opportunities to optimize workflows, automate repetitive tasks, and enhance efficiency in data processing and delivery.

What you will bring to the role: Must-Have: A minimum of 3-5 years of experience in a database developer or analyst position. A Bachelor's in Computer Science or Engineering, or equivalent work experience. Exceptional analytical thinking and problem-solving capabilities. Strong verbal and written communication skills, with the ability to articulate technical concepts clearly. Proficiency in analyzing complex financial or IP data sets. Hands-on experience with engineering principles, including designing and implementing scalable solutions. Strong attention to detail and commitment to ensuring data accuracy and integrity.

Preferred: Experience working with SQL and/or Python for data manipulation and analysis. Experience working in finance or IP-related industries, with an understanding of their unique challenges and requirements. Familiarity with handling large-scale datasets and cloud-based platforms (e.g., AWS, Azure, Google Cloud). Knowledge of DevOps practices and CI/CD pipelines to streamline database management and deployment. Understanding of data warehousing architectures and business intelligence tools for advanced analytics. Certifications in relevant database technologies (e.g., Microsoft Certified: Azure Database Administrator Associate or Oracle Certified Professional) are a bonus.

Shift: Flexible (US & UK shifts).

Equal Employment Opportunity: Rightsline is an equal opportunity workplace. All candidates will be afforded equal opportunity through the recruiting process. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, disability, or gender identity and/or expression. We are dedicated to growing a diverse team of highly talented individuals and creating an inclusive environment where everyone feels empowered to bring their authentic selves to work.

Apply Today: If you want to join a company that strives for mission, purpose, and impact, we encourage you to apply today.

Posted 9 hours ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview: PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform Databricks SME, responsible for overseeing platform administration, security, new NPI tools integration, migrations, platform maintenance, and other platform administration activities on Azure/AWS. The ideal candidate will have hands-on experience with Azure/AWS services: Infrastructure as Code (IaC), platform provisioning and administration, cloud network design, cloud security principles, and automation.

Responsibilities: The Databricks Subject Matter Expert (SME) plays a pivotal role in administration, security best practices, platform sustain support, new tools adoption, cost optimization, and supporting new patterns/design solutions using the Databricks platform. Here's a breakdown of typical responsibilities:

Core Technical Responsibilities: Architect and optimize big data pipelines using Apache Spark, Delta Lake, and Databricks-native tools. Design scalable data ingestion and transformation workflows, including batch and streaming (e.g., Kafka, Spark Structured Streaming). Create integration guidelines to configure and integrate Databricks with existing security tools relevant to data access control. Implement data security and governance using Unity Catalog, access controls, and data classification techniques. Support migration of legacy systems to Databricks on cloud platforms like Azure, AWS, or GCP. Manage cloud platform operations with a focus on FinOps support, optimizing resource utilization, cost visibility, and governance across multi-cloud environments.

Collaboration & Advisory: Act as a technical advisor to data engineering and analytics teams, guiding best practices and performance tuning. Partner with architects and business stakeholders to align Databricks solutions with enterprise goals. Lead proof-of-concept (PoC) initiatives to demonstrate Databricks capabilities for specific use cases.

Strategic & Leadership Contributions: Mentor junior engineers and promote knowledge sharing across teams. Contribute to platform adoption strategies, including training, documentation, and internal evangelism. Stay current with Databricks innovations and recommend enhancements to existing architectures.

Specialized Expertise (Optional but Valuable): Machine learning and AI integration using MLflow, AutoML, or custom models. Cost optimization and workload sizing for large-scale data processing. Compliance and audit readiness for regulated industries.

Qualifications: Bachelor's degree in Computer Science. At least 12 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 5 years in a platform admin role. Strong understanding of data security principles and best practices. Expertise in the Databricks platform, its security features, Unity Catalog, and data access control mechanisms. Experience with data classification and masking techniques. Strong understanding of cloud cost management, with hands-on experience in usage analytics, budgeting, and cost optimization strategies across multi-cloud platforms. Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps. Deep expertise in Azure/AWS big data and analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools. Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets. Proficiency in scripting and automation tools such as PowerShell, Python, Terraform, and Ansible. Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences. Certifications in Azure/AWS/Databricks platform administration, networking, and security are preferred. Strong self-organization, time management, and prioritization skills. A high level of attention to detail, excellent follow-through, and reliability. Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization. Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams. A strategic thinker focused on business value results that utilize technical solutions. Strong communication skills in writing, speaking, and presenting. Capable of working effectively in a multi-tasking environment. Fluent in English.
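Unity Catalog governance of the kind described usually comes down to SQL GRANT statements scoped to catalogs, schemas, and tables. A minimal sketch is below, assuming a Databricks notebook where the `spark` session is predefined; the catalog, schema, table, and group names are hypothetical.

```python
# Hypothetical Unity Catalog grants: give an analyst group read-only access to
# one curated table. Names are placeholders; `spark` is the session object a
# Databricks notebook provides.
grants = [
    "GRANT USE CATALOG ON CATALOG finance TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA finance.curated TO `data-analysts`",
    "GRANT SELECT ON TABLE finance.curated.payments TO `data-analysts`",
]
for statement in grants:
    spark.sql(statement)  # each GRANT is a plain SQL statement
```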

Posted 9 hours ago

Apply

6.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are seeking a Data Modeler with 6+ years of experience to join our Data & Analytics (D&A) Service Line team. The ideal candidate will possess strong expertise in data modeling techniques (conceptual, logical, and physical) and have hands-on experience with Persistent Data Layer (PDL) and Logical Data Layer (LDL) design. A solid functional understanding of the payments domain, covering payment gateways, cards, cashflows, and related processes, is essential for this role. The Data Modeler will collaborate closely with business stakeholders, data architects, and technical teams to design robust and scalable data models that drive analytics, reporting, and operational efficiencies within the payments ecosystem.

Responsibilities: Design, develop, and maintain conceptual, logical, and physical data models aligned with business requirements in the payments domain. Develop Persistent Data Layer (PDL) and Logical Data Layer (LDL) schemas, ensuring data integrity, consistency, and optimal performance. Leverage deep functional knowledge of payments systems, including payment gateways, card processing, cashflows, transaction flows, settlement, and reconciliation processes. Work with business analysts, data architects, and developers to translate business needs into data model designs that support analytics, BI, and reporting solutions. Ensure adherence to data governance policies, standards, and best practices, including data security and compliance requirements relevant to payments. Maintain comprehensive documentation of data models, data dictionaries, metadata, and technical specifications. Support ETL teams and data engineers in data mapping, lineage, and integration tasks related to payments data. Analyze and troubleshoot data-related issues and work toward continuous improvement of data models and related processes. Engage with cross-functional teams and external vendors to ensure alignment and understanding of payments data requirements.

Posted 9 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Please find the job description for this role below: The candidate should have a mix of business analytics and data engineering experience, with at least 5-7 years in total, a strong understanding of ETL concepts, experience with Spark, and a good grasp of data modeling and data warehousing concepts.

Posted 9 hours ago

Apply

56.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Join our transformation within the RMG Data Engineering team in Hyderabad, and you will have the opportunity to work with a collaborative and dynamic network of technologists. Our teams play a pivotal role in implementing data products, creating impactful visualizations, and delivering seamless data solutions to downstream systems. At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets and with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone - no matter what role - contributes ideas and drives outcomes.

What role will you play? In this role, you will apply your expertise in big data technologies and DevOps practices to design, develop, deploy, and support data assets throughout their lifecycle. You'll establish templates, methods, and standards while managing deadlines, solving technical challenges, and improving processes. A growth mindset, a passion for learning, and adaptability to innovative technologies will be essential to your success.

What You Offer: Hands-on experience building, implementing, and enhancing enterprise-scale data platforms. Proficiency in big data, with expertise in Spark, Python, Hive, SQL, and Presto, storage formats like Parquet, and orchestration tools such as Apache Airflow. Knowledge of cloud environments (preferably AWS), with an understanding of EC2, S3, Linux, Docker, and Kubernetes. ETL tools: proficiency in Talend, Apache Airflow, dbt, Informatica, and AWS Glue. Data warehousing: experience with Amazon Redshift and Athena. Kafka development engineering: experience developing and managing streaming data pipelines using Apache Kafka.

We love hearing from anyone inspired to build a better future with us; if you're excited about the role or working at Macquarie, we encourage you to apply.

What We Offer - Benefits: At Macquarie, you're empowered to shape a career that's rewarding in all the ways that matter most to you. Macquarie employees can access a wide range of benefits which, depending on eligibility criteria, include: 1 wellbeing leave day per year; 26 weeks' paid maternity leave or 20 weeks' paid parental leave for primary caregivers, along with 12 days of paid transition leave upon return to work and 6 weeks' paid leave for secondary caregivers; company-subsidised childcare services; 2 days of paid volunteer leave and donation matching; benefits to support your physical, mental, and financial wellbeing, including comprehensive medical and life insurance cover, the option to join a parental medical insurance plan, and virtual medical consultations extended to family members; access to our Employee Assistance Program, a robust behavioural health network with counselling and coaching services; access to a wide range of learning and development opportunities, including reimbursement for professional membership or subscription; hybrid and flexible working arrangements, dependent on role; and reimbursement for work-from-home equipment.

About Technology: Technology enables every aspect of Macquarie, for our people, our customers, and our communities. We're a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our commitment to diversity, equity and inclusion: We are committed to fostering a diverse, equitable, and inclusive workplace. We encourage people from all backgrounds to apply and welcome all identities, including race, ethnicity, cultural identity, nationality, gender (including gender identity or expression), age, sexual orientation, marital or partnership status, parental, caregiving or family status, neurodiversity, religion or belief, disability, or socio-economic background. We welcome further discussions on how you can feel included and belong at Macquarie as you progress through our recruitment process. Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.
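Apache Airflow is named above as the orchestration tool for these data assets. The sketch below shows the shape of a minimal daily DAG; the DAG id, task callables, and schedule are invented for illustration.

```python
# Hypothetical daily Airflow DAG: stage a raw file, then run a transform once
# staging succeeds. Task bodies are placeholders for real work.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_file():
    print("copy the raw file into the staging area")  # placeholder work


def run_transform():
    print("submit the Spark transform for the staged data")  # placeholder work


with DAG(
    dag_id="daily_asset_build",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    stage = PythonOperator(task_id="stage_file", python_callable=stage_file)
    transform = PythonOperator(task_id="run_transform", python_callable=run_transform)

    stage >> transform  # enforce ordering: transform runs only after staging
```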

Posted 9 hours ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview PepsiCo Data BI & Integration Platforms is seeking a Midlevel TIBCO Messaging (EMS, Business Works) Platform technology leader, responsible for overseeing the deployment, and maintenance of on-premises and cloud infrastructure (AWS/Azure) for its North America PepsiCo Foods/Beverages business. The ideal candidate will have hands-on experience in managing and maintaining TIBCO EMS (Enterprise Messaging System) and TIBCO Business Works (BW) platforms, ensuring system stability, security, and optimal performance including - Infrastructure as Code (IaC), platform provisioning & administration, network design, security principles and automation. Responsibilities TIBCO platform administration Install, configure, upgrade, and maintain TIBCO EMS servers and BW environments. Deploy and manage TIBCO applications, including BW projects and integrations Monitoring system health and performance, identifying and resolving issues, and ensuring smooth operation of business processes. Tuning system parameters, optimizing resource utilization, and ensuring the efficient operation of applications. Collaborating with development, QA, and other teams to resolve technical issues and ensure seamless integration of applications. Developing scripts and automating tasks for administration and maintenance purposes. Configuring and managing adapters for seamless integration with various systems. Developing and managing Hawk rulebases for monitoring BW engines, adapters, and log files Cloud Infrastructure & Automation Implement and support TIBCO application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies. Implement TIBCO cloud infrastructure policies, standards, and best practices, ensuring cloud environment adherence to security and regulatory requirements. Design, deploy and optimize cloud-based TIBCO infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services. Drive troubleshooting of TIBCO cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with global cloud center of excellence & enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS, TIBCO). Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors. Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources. Write and maintain scripts for automation and deployment using PowerShell, Python, or Azure/AWS CLI. Work with stakeholders to document architectures, configurations, and best practices. Knowledge of cloud security principles around data protection, identity and access Management (IAM), compliance and regulatory, threat detection and prevention, disaster recovery and business continuity. Qualifications Bachelor’s degree in computer science. At least 6 to 8 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 4 years in a technical leadership role. Thorough knowledge of TIBCO EMS, BW, and related components (e.g., Adapters, Hawk). Strong understanding of Unix/Linux operating systems, as TIBCO products often run on these platforms. 
- Proficiency in enterprise messaging concepts, including queues, topics, and messages.
- Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
- Strong expertise in Azure/AWS messaging technologies, real-time data ingestion, data warehouses, serverless ETL, DevOps, Kubernetes, virtual machines, and monitoring and security tools.
- Strong expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, and virtual networks and subnets.
- Proficient in scripting and automation tools such as PowerShell, Python, Terraform, and Ansible.
- Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
- Certifications in Azure/AWS platform administration, networking, and security are preferred.
- TIBCO Certified Professional certifications (e.g., TIBCO EMS Administrator) are desirable.
- Strong self-organization, time management, and prioritization skills.
- A high level of attention to detail, excellent follow-through, and reliability.
- Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization.
- Ability to listen and to establish rapport and credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
- Strategic thinker focused on business value results that utilize technical solutions.
- Strong communication skills in writing, speaking, and presenting.
- Able to work effectively in a multi-tasking environment.
- Fluent in English.
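To give a flavor of the scripted automation this role describes, here is a minimal Python sketch using boto3 (an assumption on our part; the posting names Python and the AWS CLI as options). The queue name, region, and alert threshold are hypothetical, and AWS credentials are assumed to be configured:

```python
import boto3

# Hypothetical names and thresholds for illustration only.
QUEUE_NAME = "orders-integration-dlq"
DEPTH_ALERT_THRESHOLD = 1000

sqs = boto3.client("sqs", region_name="us-east-1")

# Idempotent provisioning: create_queue returns the existing queue's URL
# when a queue with the same name and attributes already exists.
queue_url = sqs.create_queue(
    QueueName=QUEUE_NAME,
    Attributes={"MessageRetentionPeriod": "1209600"},  # 14 days
)["QueueUrl"]

# Basic health signal: alert when the backlog exceeds a threshold.
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=["ApproximateNumberOfMessages"],
)["Attributes"]

depth = int(attrs["ApproximateNumberOfMessages"])
if depth > DEPTH_ALERT_THRESHOLD:
    print(f"ALERT: {QUEUE_NAME} backlog is {depth} messages")
else:
    print(f"OK: {QUEUE_NAME} backlog is {depth} messages")
```

A production version of this kind of check would typically be scheduled and wired into a monitoring tool rather than printing to stdout.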

Posted 10 hours ago

Apply

6.0 years

0 Lacs

India

On-site

About YipitData:
YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world’s largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc’s Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency.

Why You Should Apply NOW:
- You’ll be working with many strategic engineering leaders within the company.
- You’ll report directly to the Director of Data Engineering.
- You will help build out our Data Engineering team presence in India.
- You will work with a global team.
- You’ll be challenged with a lot of big data problems.

About The Role:
We are seeking a highly skilled Senior Data Engineer to join our dynamic Data Engineering team. The ideal candidate possesses 6-8 years of data engineering experience, a solid understanding of Spark and SQL, and data pipeline experience. Hired individuals will play a crucial role in building out our data engineering team to support our strategic pipelines and optimize them for reliability, efficiency, and performance. Additionally, Data Engineering serves as the gold standard for all other YipitData analyst teams, building and maintaining the core pipelines and tooling that power our products. This high-impact, high-visibility team is instrumental to the success of our rapidly growing business. This is a unique opportunity to be the first hire on this team, with the potential to build and lead the team as its responsibilities expand. This is a hybrid opportunity based in India. During training and onboarding, we will expect several hours of overlap with US working hours. Afterward, standard IST working hours are permitted, with the exception of 1-2 days per week when you will join meetings with the US team.

As Our Senior Data Engineer You Will:
- Report directly to the Senior Manager of Data Engineering, who will provide significant, hands-on training on cutting-edge data tools and techniques.
- Build and maintain end-to-end data pipelines.
- Help set best practices for our data modeling and pipeline builds.
- Create documentation, architecture diagrams, and other training materials.
- Become an expert at solving complex data pipeline issues using PySpark and SQL.
- Collaborate with stakeholders to incorporate business logic into our central pipelines.
- Deeply learn Databricks, Spark, and other ETL tooling developed internally.

You Are Likely To Succeed If:
- You hold a Bachelor’s or Master’s degree in Computer Science, STEM, or a related technical discipline.
- You have 6+ years of experience as a Data Engineer or in other technical functions.
- You are excited about solving data challenges and learning new skills.
- You have a great understanding of working with data and building data pipelines.
- You are comfortable working with large-scale datasets using PySpark, Delta, and Databricks.
- You understand business needs and the rationale behind data transformations to ensure alignment with organizational goals and data strategy.
- You are eager to constantly learn new technologies.
- You are a self-starter who enjoys working collaboratively with stakeholders.
- You have exceptional verbal and written communication skills.

Nice to have: Experience with Airflow, dbt, Snowflake, or equivalent.

What We Offer:
Our compensation package includes comprehensive benefits, perks, and a competitive salary. We care about your personal life, and we mean it: we offer vacation time, parental leave, team events, learning reimbursement, and more! Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer.

Job Applicant Privacy Notice
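As a flavor of the day-to-day PySpark work described above, here is a minimal pipeline sketch. The paths, column names, and business logic are hypothetical; it assumes a Spark runtime with Delta Lake available (as on Databricks):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Hypothetical source path for illustration only.
raw = spark.read.format("delta").load("/data/raw/orders")

# Deduplicate and aggregate to a daily grain per merchant.
daily = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("ordered_at"))
       .groupBy("order_date", "merchant_id")
       .agg(
           F.countDistinct("order_id").alias("orders"),
           F.sum("order_total").alias("gross_revenue"),
       )
)

# Full overwrite for simplicity; an incremental pipeline would use
# replaceWhere or MERGE to rewrite only the affected partitions.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("/data/curated/orders_daily"))
```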

Posted 10 hours ago

Apply

0 years

0 Lacs

Vishakhapatnam, Andhra Pradesh, India

On-site

Company Description
RASAPOORNA FOODS PRIVATE LIMITED is a company based in Visakhapatnam, Andhra Pradesh, India. We specialize in providing high-quality food products and services to our clients. Our company operates from its headquarters at HARI PRIYA HEAVEN, KRM COLONY, Seethammadhara, in Vishakhapatnam. We are dedicated to maintaining a high standard of excellence in our offerings and operations.

Role Description
We are seeking a full-time Power BI Consultant to join our team in Vishakhapatnam. This on-site role involves designing, developing, and maintaining Power BI dashboards and reports. You will be responsible for data modeling, creating ETL processes, and supporting data warehousing initiatives. The role includes analyzing business requirements, creating data visualizations, and providing insights to support decision-making.

Qualifications
- Strong analytical skills
- Experience with Extract, Transform, Load (ETL) processes
- Proficiency in creating dashboards using Power BI
- Expertise in data modeling and data warehousing
- Excellent problem-solving and communication skills
- Ability to work independently and collaborate with cross-functional teams
- Bachelor’s degree in Computer Science, Information Technology, or a related field
- Experience in the food industry is a plus
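For a sense of the ETL groundwork that typically feeds a Power BI data model, here is a minimal pandas sketch (our assumption of tooling; file names and columns are hypothetical) that splits a flat sales extract into a dimension and a fact table, i.e. a simple star schema:

```python
import pandas as pd

# Hypothetical flat extract from a point-of-sale system.
sales = pd.read_csv("pos_sales_extract.csv", parse_dates=["sold_at"])

# Dimension: one row per product, with a surrogate key.
dim_product = (
    sales[["product_code", "product_name", "category"]]
    .drop_duplicates("product_code")
    .reset_index(drop=True)
)
dim_product["product_key"] = dim_product.index + 1

# Fact: measures keyed to the dimension, at daily grain.
fact_sales = (
    sales.merge(dim_product[["product_code", "product_key"]], on="product_code")
         .assign(sale_date=lambda df: df["sold_at"].dt.date)
         .groupby(["sale_date", "product_key"], as_index=False)
         .agg(units=("quantity", "sum"), revenue=("amount", "sum"))
)

# Power BI can import these two files directly as a star schema.
dim_product.to_csv("dim_product.csv", index=False)
fact_sales.to_csv("fact_sales.csv", index=False)
```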

Posted 10 hours ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About SpurQLabs:
SpurQLabs is an independent software testing and test automation company with a mission to help our clients build exceptional-quality products at speed. We specialize in test automation, performance testing, API testing, and CI/CD enablement across industries including life sciences, pharmaceuticals, and regulated environments.

Job Summary:
We are seeking a detail-oriented ETL Test Engineer with strong Python, SQL, and AWS skills to validate data pipelines in compliance with GxP and FDA regulations. This role involves test planning, test execution, automated validation, and ensuring high-quality, audit-ready data workflows.

Key Responsibilities:
- Develop and execute test plans, test cases, and test scripts to validate ETL processes, data migrations, and transformations according to GxP and industry standards.
- Conduct functional, integration, and regression testing across various data sources and targets to ensure accurate extraction, transformation, and loading.
- Collaborate with data engineers, business analysts, and stakeholders to understand data mappings, business logic, and compliance needs.
- Build and maintain automated ETL test suites using Python and testing frameworks for continuous validation of data pipelines.
- Perform data profiling and quality assessments, identify discrepancies, and work with stakeholders to resolve integrity issues.
- Document and report test outcomes, validation findings, and defects using defined templates and issue-tracking tools.
- Participate in validation planning, execution, and documentation aligned with regulatory guidelines, GxP, FDA, and company SOPs.
- Ensure traceability, auditability, and data integrity across all validation activities.
- Stay current on industry trends, compliance updates, and best practices in ETL testing and data validation.
- Contribute to process improvement and knowledge sharing within the team.

Technical Skills:
Mandatory:
- Python: for automation of ETL validation
- SQL: strong skills for data querying and validation
- AWS Cloud Services: especially S3 and Databricks
- Snowflake: hands-on experience with a cloud data warehouse

Nice to Have:
- Experience with automated ETL testing frameworks
- Familiarity with data compliance frameworks (GxP, FDA, Part 11)
- Exposure to validation documentation tools and issue-tracking systems

Qualifications:
- Bachelor’s degree in Computer Science, Engineering, Life Sciences, or a related field
- 3 to 6 years of hands-on experience in ETL testing in a regulated or data-intensive environment
- Experience in GxP-compliant environments is strongly preferred
- Strong communication, analytical, and problem-solving skills
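To illustrate the automated ETL test suites the role calls for, here is a minimal pytest sketch. The tables and values are hypothetical, and sqlite3 stands in for the real source and target connections so the example is self-contained:

```python
import pytest
import sqlite3

@pytest.fixture
def conn():
    # Hypothetical stand-in: a real suite would open connections to the
    # source system and the warehouse instead of an in-memory database.
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
    """)
    return c

def test_row_counts_match(conn):
    # Completeness: no records dropped or duplicated by the load.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt

def test_amount_totals_match(conn):
    # Accuracy: aggregate measures reconcile between source and target.
    src = conn.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
    assert src == pytest.approx(tgt)
```

In a GxP context, tests like these would be tied to requirement IDs so that every check remains traceable and audit-ready.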

Posted 10 hours ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

You are as unique as your background, experience and point of view. Here, you’ll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world.

Job Description: Sun Life - Data Modeler

The Data Modeler will work on the design and implementation of new data structures to support the project teams delivering on ETL and data warehouse design, managing the enterprise data model, the maintenance of the data, and enterprise data integration approaches.

Technical Responsibilities
- Build and maintain data models to standards, reporting disparate data sets in a reliable, consistent and interpretable manner.
- Gather, distil and harmonize data requirements, and design coherent conceptual, logical and physical data models and associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings involving complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design, covering both transactional and dimensional data modelling.
- Normalize/de-normalize data structures and introduce hierarchies and inheritance wherever required in existing or new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and to propose standardized data models.
- Contribute to improving the Data Management data models.
- Act as an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.

Requirements
- Extensive practical experience in Information Technology and software development projects, with at least 8 years of experience in designing operational data stores and data warehouses.
- Extensive experience in a data modelling tool such as Erwin or SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes and best practices.
- Proficient in data modelling, including conceptual, logical and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- A combination of solid business knowledge and technical expertise with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal and written communication skills, and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Good to have
- Understanding of the insurance domain
- Basic understanding of AWS cloud
- Good understanding of Master Data Management, Data Quality and Data Governance
- Basic understanding of data visualization tools like SAS VA and Tableau
- Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle

Job Category: Advanced Analytics
Posting End Date: 16/09/2025
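As an illustration of the SQL-based profiling this role describes, here is a minimal sketch that checks a staging table for null attributes and duplicate natural keys before a transformation is validated. Table and column names are hypothetical, and sqlite3 keeps the example self-contained; in practice the query would run against the source and target warehouses:

```python
import sqlite3

# Hypothetical staging data for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_policy (policy_id INTEGER, holder_name TEXT);
    INSERT INTO stg_policy VALUES (1, 'A'), (2, NULL), (2, 'B');
""")

# One pass over the table: row count, key cardinality, null density.
row_count, distinct_keys, null_names = conn.execute("""
    SELECT COUNT(*),
           COUNT(DISTINCT policy_id),
           SUM(CASE WHEN holder_name IS NULL THEN 1 ELSE 0 END)
    FROM stg_policy
""").fetchone()

print(f"rows={row_count} distinct_keys={distinct_keys} null_names={null_names}")
if distinct_keys < row_count:
    print("WARNING: duplicate policy_id values; natural key is not unique")
```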

Posted 10 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Growth Strategy Team at Innovaccer
Innovaccer is forming a new strategic advisory team that will support healthcare organizations in better understanding their opportunities and levers for maximizing outcomes, particularly in, but not limited to, value-based care arrangements and population health initiatives. This role requires a "full stack" approach to analytics, covering all parts of the analytics value chain, including data ETL and manipulation, analysis, reporting, visualizations, insights, and final deliverable creation. The ideal candidate will possess a player/coach mentality as this team matures, with the willingness and ability to roll up their sleeves and contribute in the early days and transition to growing in responsibility as we scale. This candidate will be comfortable diving into both structured and unstructured data, creating robust financial models and business cases, producing compelling visualizations and collateral, and leading the narrative on data-driven storytelling.

About The Role
We are looking for a Senior Manager - Advisory Services, a key role within the Advisory Services team at Innovaccer. This individual will be responsible for delivering key customer analytics (e.g. ROI models), performance analytics and slide presentations to support multiple client pursuits and engagements. The ideal candidate has a strong desire to learn about the US healthcare system, is organized and structured, has excellent written and verbal communication skills and is a fast learner. The role requires both analytical skills and creativity to articulate and communicate complex messages about healthcare and technology to a wide-ranging audience. You will be aligned with a Managing Director/Director in the US who will provide direction on day-to-day work and help you learn about the company and the industry.

A Day in the Life
- Under direction of Advisory Services leaders, engage with prospect organizations on intended business outcomes and request data assets to model potential scenarios.
- Own, digest, and interpret data in a variety of forms, from aggregated metrics in spreadsheets to unstructured formats to raw, transactional forms like medical claims.
- Own and execute the entire analytics lifecycle, leveraging data in all its available forms to produce cogent and compelling business cases, financial models, presentations, and other executive-ready final deliverables.
- Synthesize insights to inform strategic direction, roadmap creation, and opportunities.
- Couple Innovaccer's technology platform - including data, software and workflow applications, analytics, and AI - with identified insights and opportunities to create prescriptive recommendations that maximize value creation and outcomes.
- Develop findings and insights for senior leadership of prospects and clients and Innovaccer stakeholders in a clear and compelling manner.
- Stay up-to-date with the latest analytics technologies and methodologies to enhance capabilities.
- Build compelling presentations, including client sales and engagement delivery decks, case studies, talk tracks, and visuals.
- Research and analyze high-priority strategic clients, industry best practices and market intelligence, including industry mapping, customer profiling, competitive insights and deep dives into select solution opportunities.
- Co-develop and maintain a standardized value-lever framework, segment-based pitch decks and customer case studies for use across multiple advisory pursuits and engagements.
- Provide analytics thought partnership and data support on the design, execution, and measurement of impactful advisory services strategy initiatives.
- Collaborate across Advisory Services, Growth Strategy, Marketing, Sales, Product, and Customer Success teams and business leaders to address business questions that can be answered effectively through data-driven modeling and insights.
- Develop slide presentations for quarterly and annual reporting presentations.
- Structure, manage, and write responses to RFPs.

What You Need
- Degree from a Tier 1 college with a relevant degree in Finance, Economics, Statistics, Business, or Marketing.
- 3-5 years of professional experience, including experience in management consulting and/or go-to-market in a technology/software/SaaS company.
- Strong technical aptitude and fantastic storytelling skills, with a great track record of working across sales, marketing, and technology teams.
- Ability to identify, source, and include data elements to drive analytical models and outputs.
- Experience creating Excel models (identifying inputs, key considerations/variables, and relevant outputs) and PowerPoint presentations.
- Familiarity with leveraging AI tools (e.g., generative AI, AI-enhanced research tools, AI-based data analysis platforms) to enhance productivity, accelerate research, generate insights, and support creative problem-solving.
- Proactive, decisive, independent thinker, good at problem solving and conducting industry research.
- Experience making slide presentations for internal and external audiences that articulate key takeaways.
- Creative problem solver with the ability to back up ideas with requisite fact-based arguments.
- Comfortable working with multiple data sources in both structured and unstructured formats to frame a business opportunity and develop a structured path forward.
- Strong proficiency in Excel and PowerPoint or G Suite.
- Willing to work in a fast-paced environment under tight deadlines.
- Strong written and verbal communication skills, as well as the ability to manage cross-functional stakeholders.
- Experience with analytics and financial modeling.
- US healthcare experience and/or a strong willingness and interest to learn this space. Specific areas of interest include: payer/provider/patient dynamics; provider data strategy and architecture; provider advanced analytics, AI, and NLP; patient experience and engagement; population health and care management; utilization and cost management; risk and quality management; risk models; value-based care; and social determinants of health.

We offer competitive benefits to set you up for success in and outside of work.

Here's What We Offer
- Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
- Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break?
We've got you covered.
- Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
- Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only
- Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices

Where And How We Work
Our Noida office is situated in a posh tech space, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

Posted 10 hours ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Business Intelligence - Manager
Location: Mumbai

About Us
StayVista is India’s largest villa hospitality brand and has redefined group getaways. Our handpicked luxury villas are present in every famous holiday destination across the country. We curate unique experiences paired with top-notch hospitality, creating unforgettable stays. Here, you will be a part of our passionate team, dedicated to crafting exceptional getaways and curating one-of-a-kind homes. We are a close-knit tribe, united by a shared love for travel and on a mission to become the most loved hospitality brand in India.

Why Work With Us?
At StayVista, you're part of a community where your ideas and growth matter. We’re a fast-growing team that values continuous improvement. With our skill upgrade programs, you’ll keep learning and evolving, just like we do. And hey, when you’re ready for a break, our villa discounts make it easy to enjoy the luxury you help create.

Your Role
As a Manager - Business Intelligence, you will lead data-driven decision-making by transforming complex datasets into strategic insights. You will optimize data pipelines, automate workflows, and integrate AI-powered solutions to enhance efficiency. Your expertise in database management, statistical analysis, and visualization will support business growth, while collaboration with leadership and cross-functional teams will drive impactful analytics strategies.

About You
- 8+ years of experience in Business Intelligence, Revenue Management, or Data Analytics, with a strong ability to turn data into actionable insights.
- Bachelor’s or Master’s degree in Business Analytics, Data Science, Computer Science, or a related field.
- Skilled in designing, developing, and implementing end-to-end BI solutions to improve decision-making.
- Proficient in ETL processes using SQL, Python, and R, ensuring accurate and efficient data handling.
- Experienced in Google Looker Studio, Apache Superset, Power BI, and Tableau for creating clear, real-time dashboards and reports.
- Able to develop, document and support ETL mappings, database structures and BI reports.
- Able to develop ETL using tools such as Pentaho or Talend, or as per project requirements.
- Participates in the UAT process and ensures quick resolution of any UAT or data issue.
- Manages different environments and is responsible for proper deployment of reports/ETLs in all client environments.
- Interacts with Business and Product teams to understand and finalize functional requirements.
- Responsible for timely deliverables and quality.
- Skilled at analyzing industry trends and competitor data to develop effective pricing and revenue strategies.
- Demonstrated understanding of data warehouse concepts, ETL concepts, ETL loading strategy, data archiving, data reconciliation, ETL error handling, error-logging mechanisms, standards and best practices (see the sketch after this posting).

Cross-functional Collaboration
Partner with Product, Marketing, Finance, and Operations to translate business requirements into analytical solutions.

Key Metrics: What You Will Drive and Achieve
- Data-driven decision-making and business impact.
- Revenue growth and cost optimization.
- Cross-functional collaboration and leadership impact.
- BI and analytics efficiency, and AI automation integration.
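The ETL error-handling and error-logging patterns mentioned above can be illustrated with a minimal Python sketch. The table, fields, and sample rows are hypothetical, and sqlite3 stands in for the real target database; the point is the quarantine-and-log load strategy rather than any specific tool:

```python
import logging
import sqlite3

logging.basicConfig(
    filename="etl_run.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

def load_bookings(rows, conn):
    """Load rows into a staging table, rejecting and logging bad
    records instead of failing the whole batch."""
    loaded, rejected = 0, 0
    for row in rows:
        try:
            conn.execute(
                "INSERT INTO stg_bookings (booking_id, nights, amount) VALUES (?, ?, ?)",
                (int(row["booking_id"]), int(row["nights"]), float(row["amount"])),
            )
            loaded += 1
        except (KeyError, ValueError) as exc:
            rejected += 1
            logging.error("rejected row %r: %s", row, exc)
    conn.commit()
    logging.info("load complete: %d loaded, %d rejected", loaded, rejected)
    return loaded, rejected

# Hypothetical demo data; a real pipeline would read from source extracts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_bookings (booking_id INTEGER, nights INTEGER, amount REAL)")
load_bookings(
    [{"booking_id": "101", "nights": "2", "amount": "8500.0"},
     {"booking_id": "102", "nights": "two", "amount": "9000.0"}],  # bad record
    conn,
)
```

The rejected-row log doubles as a reconciliation artifact: loaded plus rejected should always equal the source row count.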

Posted 10 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 7-9 years
- Experience in AWS services is a must, e.g. S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc.
- Experience programming in Java and Python.
- Experience performing data analysis (NOT data science) on AWS platforms.

Nice To Have
- Experience with Big Data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.).
- Experience with data management processes on AWS is a huge plus.
- Experience implementing complex ETL transformations on AWS using Glue (see the sketch below).
- Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
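For the Glue ETL work mentioned above, here is a minimal job sketch using the standard Glue script skeleton. The catalog database, table, and S3 path are hypothetical, and the script runs only inside the AWS Glue runtime, which provides the awsglue library:

```python
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Rename/cast columns as a simple transformation step.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_total", "string", "order_total", "double"),
    ],
)

# Write the result to S3 as Parquet so Athena can query it.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```

More complex transformations typically chain further Glue transforms or convert the DynamicFrame to a Spark DataFrame for joins and aggregations before writing.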

Posted 10 hours ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Where Data Does More. Join the Snowflake team.

We are looking for people who have a strong background in data science and cloud architecture to join our AI/ML Workload Services team to create exciting new offerings and capabilities for our customers! This team within the Professional Services group will work with customers using Snowflake to expand their use of the Data Cloud, bringing data science pipelines from ideation to deployment and beyond using Snowflake's features and its extensive partner ecosystem. The role will be highly technical and hands-on: you will design solutions based on requirements and coordinate with customer teams and, where needed, systems integrators.

AS A SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL:
- Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload.
- Build and deploy ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools, based on customer requirements.
- Work hands-on where needed, using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the data science workload.
- Follow best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own.
- Maintain a deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them.
- Work with systems integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments.
- Provide guidance on how to resolve customer-specific technical challenges.
- Help other members of the Professional Services team develop their expertise.
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.

OUR IDEAL SOLUTIONS ARCHITECT - AI/ML WILL HAVE:
- A minimum of 10 years' experience working with customers in a pre-sales or post-sales technical role.
- Skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos.
- A thorough understanding of the complete data science lifecycle, including feature engineering, model development, model deployment and model management.
- A strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models.
- Experience with and understanding of at least one public cloud platform (AWS, Azure or GCP).
- Experience with at least one data science tool such as AWS SageMaker, Azure ML, Dataiku, DataRobot, H2O, or Jupyter Notebooks.
- Hands-on scripting experience with SQL and at least one of Python, Java or Scala, plus experience with libraries such as Pandas, PyTorch, TensorFlow, scikit-learn or similar.
- A university degree in computer science, engineering, mathematics or related fields, or equivalent experience.

BONUS POINTS FOR HAVING:
- Experience with Databricks/Apache Spark.
- Experience implementing data pipelines using ETL tools.
- Experience working in a data science role.
- Proven success at enterprise software companies.
- Vertical expertise in a core vertical such as FSI, retail, or manufacturing.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
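For a flavor of the hands-on POC work described above, here is a minimal sketch that pulls a feature table from Snowflake via the Snowpark Python API and trains a simple scikit-learn model. The connection parameters, table, and feature columns are hypothetical:

```python
from snowflake.snowpark import Session
from sklearn.linear_model import LogisticRegression

# Hypothetical connection parameters for illustration only.
session = Session.builder.configs({
    "account": "my_account",
    "user": "my_user",
    "password": "***",
    "warehouse": "ml_wh",
    "database": "analytics",
    "schema": "features",
}).create()

# Pull a (small) feature table into pandas for local prototyping;
# larger workloads would push training into Snowflake instead.
df = session.table("customer_churn_features").to_pandas()

X = df[["TENURE_MONTHS", "MONTHLY_SPEND", "SUPPORT_TICKETS"]]
y = df["CHURNED"]

model = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", model.score(X, y))
```

A real engagement would add a train/test split, model management, and a deployment path, but this is the shape of a first-day POC.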

Posted 10 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
