
3305 Hive Jobs - Page 47

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: AI Engineer
Salary: 4 - 5.4 LPA
Experience: Minimum 2 years
Location: Hinjewadi, Pune
Work Mode: Work from Office
Availability: Immediate Joiner

About Us: Rasta.AI, a product of AI Unika Technologies (P) Ltd, is a pioneering technology company based in Pune. We specialize in road infrastructure monitoring and maintenance using cutting-edge AI, computer vision, and 360-degree imaging. Our platform delivers real-time insights into road conditions to improve safety, efficiency, and sustainability. We collaborate with government agencies, private enterprises, and citizens to enhance road management through innovative tools and solutions.

The Role: This is a full-time, on-site role. As an AI Engineer, you will develop AI models and software solutions to address real-world challenges. You will collaborate with cross-functional teams to identify business opportunities and provide customized solutions, working alongside talented engineers, designers, and data scientists to implement and maintain these models and solutions.

Technical Skills
- Programming Languages: Python (and other AI-supported languages)
- Databases: SQL, Cassandra, MongoDB
- Python Libraries: NumPy, Pandas, Scikit-learn
- Deep Neural Networks: CNN, RNN, and LLM
- Data Analysis Libraries: TensorFlow, Pandas, NumPy, Scikit-learn, Matplotlib, TensorBoard
- Frameworks: Django, Flask, Pyramid, and CherryPy
- Operating Systems: Ubuntu, Windows
- Tools: Jupyter Notebook, PyCharm IDE, Excel, Roboflow
- Big Data (bonus): Hadoop (Hive, Sqoop, Flume), Kafka, Spark
- Code Repository Tools: Git, GitHub
- DevOps/AWS: Docker, Kubernetes, instance hosting and management

Analytical Skills
- Exploratory Data Analysis, Predictive Modeling, Text Mining, Natural Language Processing
- Machine Learning, Image Processing, Object Detection, Instance Segmentation, Deep Learning
- DevOps and AWS

Knowledge Expertise
- Proficiency in the TensorFlow library with RNN and CNN
- Familiarity with pre-trained models such as VGG-16, ResNet-50, and MobileNet
- Knowledge of Spark Core, Spark SQL, Spark Streaming, Cassandra, and Kafka
- Designing and architecting Hadoop applications
- Experience with chatbot platforms (a bonus)

Responsibilities
The entire lifecycle of model development: data collection and preprocessing, model development, training, testing, validation, deployment and maintenance, along with collaboration and communication.

Qualifications
- Bachelor's or Master's degree in a relevant field (AI, Data Science, Computer Science, etc.)
- Minimum 2 years of experience developing and deploying AI-based software products
- Strong programming skills in Python (and potentially C++ or Java)
- Experience with machine learning libraries (TensorFlow, PyTorch, Keras, scikit-learn)
- Experience with computer vision, natural language processing, or recommendation systems
- Experience with cloud computing platforms (Google Cloud, AWS)
- Problem-solving skills; excellent communication and presentation skills
- Experience with data infrastructure and tools (SQL, NoSQL, and big data platforms)
- Teamwork skills

Join Us! If you are passionate about AI and want to contribute to groundbreaking projects in a dynamic startup environment, we encourage you to apply. Be part of our mission to drive technological advancement in India. Drop your CV: hr@aiunika.com
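The posting centers on CNN-based image work with TensorFlow and pre-trained backbones such as MobileNet. Purely as a hedged illustration of that kind of transfer-learning setup, and not the company's actual pipeline, here is a minimal sketch; the class labels, image size, and data directory are placeholder assumptions.

```python
# Minimal transfer-learning sketch with a MobileNetV2 backbone (illustrative only).
# NUM_CLASSES, IMG_SIZE, and the data directory layout are assumptions for the example.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 3  # e.g. good / cracked / potholed surface (hypothetical labels)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)  # assumed: one folder per class

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze the pre-trained backbone

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```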

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Data Engineer - 6 months to 2 years' experience; 2023-2024 graduates ONLY

Company Description
Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose: to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description
We are seeking a Data Engineer with a strong background in data engineering. This role involves managing system requirements, design, development, integration, quality assurance, implementation, and maintenance of corporate applications.
- Work with product owners, business stakeholders and internal teams to understand business requirements and desired business outcomes.
- Assist in scoping and designing analytic data assets, implementing modelled attributes and contributing to brainstorming sessions.
- Build and maintain a robust data engineering process to develop and implement self-serve data and tools for Visa's product management teams and data scientists.
- Find opportunities to create, automate and scale repeatable analyses or build self-service tools for business users.
- Execute data engineering projects ranging from small to large, either individually or as part of a project team.
- Set the benchmark in the team for good data engineering practices and assist leads and architects in solution design.
- Exhibit a passion for optimizing existing solutions and making incremental improvements.
This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager.

Qualifications
Basic Qualifications
- Bachelor's degree, OR 3+ years of relevant work experience

Preferred Qualifications
- Minimum of 1 year's experience building data engineering pipelines.
- Design and coding skills with Big Data technologies like Hadoop, Spark, Hive and MapReduce.
- Mastery of PySpark or Scala.
- Expertise in a programming language such as Java or Python; knowledge of OOP concepts (inheritance, polymorphism) and of implementing design patterns is needed.
- Experience with cloud platforms like AWS, GCP, or Azure is good to have.
- Excellent problem-solving skills and ability to think critically.
- Experience with an ETL tool such as Informatica, SSIS, Pentaho or Azure Data Factory.
- Knowledge of successful design and development of data-driven real-time and batch systems.
- Experience in data warehousing and expertise in at least one RDBMS such as SQL Server or Oracle.
- Nice to have: reporting skills in Power BI, Tableau or QlikView.
- Strong understanding of cloud architecture and service offerings including compute, storage, databases, networking, AI, and ML.
- Passionate about delivering zero-defect code that meets or exceeds the proposed defect SLA, with a high sense of accountability for quality and timelines on deliverables.
- Experience developing as part of an Agile/Scrum team is preferred; hands-on with Jira.
- Understanding of basic CI/CD functionality and Git concepts is a must.

Additional Information
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
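The role calls for batch pipelines built on Spark and Hive. The snippet below is a minimal, hedged sketch of that general pattern, not Visa's actual code; the database, table, and column names are invented for illustration.

```python
# Illustrative PySpark batch job over Hive tables (table and column names are assumptions).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("daily-transaction-rollup")
         .enableHiveSupport()          # lets Spark read/write Hive metastore tables
         .getOrCreate())

# Read a raw Hive table, aggregate, and write the result back as a partitioned table.
raw = spark.table("raw_db.transactions")          # hypothetical source table
daily = (raw
         .withColumn("txn_date", F.to_date("txn_ts"))
         .groupBy("txn_date", "merchant_id")
         .agg(F.count("*").alias("txn_count"),
              F.sum("amount").alias("total_amount")))

(daily.write
      .mode("overwrite")
      .partitionBy("txn_date")
      .saveAsTable("analytics_db.daily_merchant_rollup"))  # hypothetical target table

spark.stop()
```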

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Role: PySpark Developer
Locations: Hyderabad & Bangalore
Work Mode: Hybrid
Interview Mode: Virtual (2 rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, SQL, AWS (S3), Airflow/Control-M, Hadoop, Hive, Cloudera, Hortonworks, CDC, ETL pipelines, data modeling, data validation, performance tuning, unit test cases, batch and real-time integration, API integration, shell scripting, Unix/Linux, Jupyter Notebook, Zeppelin, PyCharm, CI/CD, Jenkins, Git, Agile methodologies, Informatica, Tableau, Jasper, QlikView, AI/ML model development
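One concrete skill the listing names is handling CDC feeds on large datasets. A common PySpark approach is to keep only the latest change record per key with a window function; the sketch below illustrates that pattern under stated assumptions, with bucket paths and column names invented for the example.

```python
# Hedged sketch: collapse a CDC feed to the latest record per key using a window function.
# The S3 paths and column names (customer_id, op_ts, op_type) are assumptions.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdc-compaction").getOrCreate()

changes = spark.read.parquet("s3a://example-bucket/cdc/customers/")  # hypothetical CDC extract

latest_per_key = Window.partitionBy("customer_id").orderBy(F.col("op_ts").desc())

current = (changes
           .withColumn("rn", F.row_number().over(latest_per_key))
           .filter("rn = 1")                      # newest change wins
           .filter(F.col("op_type") != "DELETE")  # drop keys whose last change was a delete
           .drop("rn"))

current.write.mode("overwrite").parquet("s3a://example-bucket/curated/customers/")
```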

Posted 2 weeks ago

Apply

175.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
The Sales Enablement Organization focuses on accelerating commercial business growth through training, tools and insights to provide a best-in-class customer experience and create a culture of doing it the right way.

Sales Ops & Governance Role
This position will support the development and implementation of analytical solutions and provide consultative support to the GCS leadership team. The incumbent will also highlight trends, risks, and opportunities to enhance business decision-making, while working closely with Sales, Marketing, Capabilities, Technology, and Analytics teams to drive growth in the sales organization.

Key Responsibilities
- Perform in-depth data analysis to deliver strategic priorities focused on the sales enablement roadmap for Small/Medium Business
- Outstanding knowledge of Python, SQL, and Hive, encompassing data manipulation and statistical modeling/data-mining techniques
- Ability to work with huge unstructured datasets, apply analytical thinking to diagnose business needs, and establish analytical hypotheses and solutions
- Analyze, deep dive, and explore to identify data gaps and solve them by collaborating across teams
- Detailed execution of the development, validation and implementation of automated analytical solutions with minimal to no manual intervention
- Leverage predictive modeling to identify tactics for channel optimization of existing areas and conceptualize opportunities
- Challenge the status quo, innovate, and maintain strong curiosity; proactively identify opportunities to improve processes by evaluating and challenging existing approaches
- Effectively challenge the conceptual soundness, theory, approach, and usage of predictive models

Minimum Qualifications
- 3+ years of database architecture and administration experience in a professional environment
- Bachelor's degree required, preferably in a quantitative field (e.g., Economics, Finance, Accounting, Statistics, Artificial Intelligence, Data Analytics, Engineering)
- Must have: high proficiency in Python and SQL, with strong working knowledge of analytical tools (e.g., Hive, PySpark, scikit-learn)
- Programming: SQL, SAS, Python/R, Unix scripting, Excel/VBA
- Experience in a Big Data environment, including data mining techniques
- Experience applying advanced statistical and/or quantitative techniques to solve business problems
- Hands-on analytics and machine learning (ML) experience with an understanding of data processing and model validation
- Ability to address performance issues and to manipulate both structured and unstructured data
- Advanced knowledge of the Microsoft Office Suite (Excel pivots, macros, deck-writing)
- Ability to cultivate relationships and partner with multiple collaborators, with superb interpersonal and communication skills
- Ability to deliver results, work independently, and prioritize tasks
- Self-starter who thrives in an evolving, dynamic environment

Preferred Qualifications
- Proficiency in CRM tools, Salesforce, or statistical software programs
- Big data platforms (Hadoop, Spark, NoSQL DB, RDBMS)
- Cloud products and services such as Google Cloud
- Visualization: Tableau, Power BI, Power Automate, Splunk
- Servicing platforms such as ServiceNow
- Others: Confluence, SharePoint or any other workflow and content management tool

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
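The role combines SQL/Hive data pulls with predictive modeling for channel optimization. Below is a minimal, hedged sketch of that workflow using PySpark to pull features from Hive and scikit-learn for a simple propensity model; all table, column, and label names are invented for illustration and are not from the posting.

```python
# Hedged sketch: pull features from Hive with PySpark, then fit a simple propensity model.
# Table and column names (sales_db.account_features, converted, etc.) are assumptions.
from pyspark.sql import SparkSession
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

features = spark.sql("""
    SELECT account_id, avg_monthly_spend, tenure_months, channel_touches, converted
    FROM sales_db.account_features
""").toPandas()  # assumed small enough to model locally in this sketch

X = features[["avg_monthly_spend", "tenure_months", "channel_touches"]]
y = features["converted"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```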

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Big Data Tester

About Us
Capco, a Wipro company, is a global technology and management consulting firm. Capco was named Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Description
Role:
• Support, develop, and maintain automated test frameworks, tools, and test cases for Data Engineering and Data Warehouse applications.
• Collaborate with cross-functional teams, including software developers, data engineers, and data analysts, to ensure comprehensive testing coverage and adherence to quality standards.
• Conduct thorough testing of data pipelines, ETL processes, and data transformations using Big Data technologies.
• Apply knowledge of Data Warehouse/Data Lake methodologies and best practices to validate the accuracy, completeness, and performance of our data storage and retrieval systems.
• Identify, document, and track software defects, working closely with the development team to ensure timely resolution.
• Participate in code reviews, design discussions, and quality assurance meetings to provide valuable insights and contribute to the overall improvement of our software products.

Base Skill Requirements
Must-have technical:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 3-5 years of experience in software testing and development, with a focus on data-intensive applications.
• Proven experience testing data pipelines and ETL processes: test planning, test environment planning, end-to-end testing, performance testing.
• Solid programming skills in Python, with proven automation work that brings efficiency to test cycles.
• Solid understanding of data models and SQL.
• Experience with ETL (Extract, Transform, Load) processes and tools (scheduling and orchestration tools, ETL design understanding).
• Good understanding of Big Data technologies like Spark, Hive, and Impala.
• Understanding of Data Warehouse methodologies, applications, and processes.
• Experience working in an Agile/Scrum environment, with a solid understanding of user stories, acceptance criteria, and sprint cycles.

Optional technical:
• Experience with scripting languages like Bash or Shell.
• Experience working with large-scale datasets and distributed data processing frameworks (e.g., Hadoop, Spark).
• Familiarity with data integration tools like Apache NiFi is a plus.
• Excellent problem-solving and debugging skills, with a keen eye for detail.
• Strong communication and collaboration skills to work effectively in a team-oriented environment.
• Eagerness to learn and contribute to a growing team.
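Since the role emphasizes Python-based automation for testing ETL transformations, here is a hedged sketch of how such a check might look with pytest and a local Spark session; the transformation under test and its column names are hypothetical, standing in for whatever pipeline code is actually being verified.

```python
# Hedged sketch: unit-testing a PySpark transformation with pytest.
# clean_orders() and its columns are hypothetical examples of a pipeline step under test.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def clean_orders(df):
    """Example transformation: drop null order ids and standardize the amount column."""
    return (df.filter(F.col("order_id").isNotNull())
              .withColumn("amount", F.col("amount").cast("double")))


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("etl-tests").getOrCreate()


def test_clean_orders_drops_null_ids_and_casts_amount(spark):
    raw = spark.createDataFrame(
        [("o1", "10.5"), (None, "3.0"), ("o2", "7")],
        ["order_id", "amount"])

    result = clean_orders(raw)

    assert result.count() == 2                           # null id row removed
    assert dict(result.dtypes)["amount"] == "double"     # amount cast correctly
```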

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Thane, Maharashtra, India

On-site


Job Requirements
Role/Job Title: Data Engineer - Gen AI
Function/Department: Data & Analytics
Place of Work: Mumbai

Job Purpose
The data engineer will work with our data scientists, who build solutions in the domains of text, audio, image and tabular data, and will be responsible for the storage, retrieval and augmentation of large volumes of structured and unstructured data.

Roles & Responsibilities
- Build data engineering pipelines focused on unstructured data.
- Conduct requirements gathering and project scoping sessions with subject matter experts, business users, and executive stakeholders to discover and define business data needs.
- Design, build, and optimize the data architecture and extract, transform, and load (ETL) pipelines to make data accessible to Data Scientists and the products built by them.
- Work on the end-to-end data lifecycle from data ingestion through data transformation to the data consumption layer; be well-versed in APIs and their usability.
- Drive the highest standards in data reliability, data integrity, and data governance, enabling accurate, consistent, and trustworthy data sets.
- Demonstrate experience with big data infrastructure including MapReduce, Hive, HDFS, YARN, HBase, MongoDB, DynamoDB, etc.
- Create technical design documentation for projects and pipelines.
- Debug code effectively when issues arise, and use Git for code versioning.

Education Qualification
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)

Experience Range: 5-10 years of relevant experience
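For the unstructured-data pipelines this role describes, a common first step is to ingest raw documents, split them into chunks, and attach metadata before storage or embedding. The sketch below shows one hedged way to do that in plain Python; the directory layout, file format, and chunk size are assumptions, not details from the listing.

```python
# Hedged sketch: ingest raw text documents and split them into metadata-tagged chunks,
# a typical preprocessing step before embedding/augmentation. Paths and sizes are assumptions.
import json
from pathlib import Path

CHUNK_SIZE = 1000  # characters per chunk (illustrative choice)


def chunk_text(text: str, size: int = CHUNK_SIZE):
    """Yield fixed-size character chunks; real pipelines often split on sentences instead."""
    for start in range(0, len(text), size):
        yield text[start:start + size]


def ingest(source_dir: str, output_path: str) -> int:
    records = []
    for doc in Path(source_dir).glob("*.txt"):            # assumed: plain-text documents
        text = doc.read_text(encoding="utf-8", errors="ignore")
        for i, chunk in enumerate(chunk_text(text)):
            records.append({"doc": doc.name, "chunk_id": i, "text": chunk})
    Path(output_path).write_text("\n".join(json.dumps(r) for r in records))
    return len(records)


if __name__ == "__main__":
    print(ingest("raw_docs", "chunks.jsonl"), "chunks written")
```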

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About the Company
Our client is a global IT, consulting, and business process services company headquartered in Bengaluru, India. It offers end-to-end IT services, including application development, infrastructure management, and digital transformation, and serves clients across industries such as banking, healthcare, retail, energy, and manufacturing. It specializes in modern technologies like cloud computing, AI, data analytics, and cybersecurity. The company has a strong global presence, operating in over 66 countries and employing more than 250,000 people worldwide. It is known for helping enterprises modernize their IT infrastructure and adopt agile practices. Its divisions include consulting, software engineering, and managed services, and it integrates automation and AI into its services to boost efficiency and innovation.

Job Title: Data Engineer (Python, Spark, Scala)
Location: Hyderabad (Hybrid)
Experience: 5+ years
Job Type: Contract to hire
Notice Period: Immediate joiners

Summary
We are seeking a skilled and motivated Data Engineer with hands-on experience in Python, Apache Spark, and Scala to join our fast-growing data engineering team. The ideal candidate will be responsible for building scalable data pipelines, optimizing data processing frameworks, and supporting advanced analytics and machine learning workflows.

Key Responsibilities
- Design, build, and maintain large-scale, distributed data processing systems using Apache Spark.
- Develop robust ETL/ELT pipelines in Python and Scala to ingest, clean, and transform data from multiple sources.
- Collaborate with data scientists, analysts, and software engineers to support data infrastructure needs.
- Ensure high performance and reliability of data workflows by implementing best practices in code and system design.
- Monitor and troubleshoot data jobs, identify bottlenecks, and propose solutions for optimization.
- Work with structured and unstructured data, using tools and technologies like Hadoop, Hive, Parquet, Kafka, etc.
- Participate in code reviews and contribute to a culture of continuous improvement and knowledge sharing.

Required Qualifications
- 5+ years of experience in data engineering or big data development.
- Strong proficiency in Python for data processing and automation.
- Solid experience with Apache Spark (RDD, DataFrame, Spark SQL, and performance tuning).
- Proficiency in Scala for building Spark applications.
- Hands-on experience with distributed data systems (Hadoop, HDFS, Hive).
- Experience with version control (e.g., Git), CI/CD, and cloud platforms (AWS, Azure, or GCP) is a plus.
- Strong understanding of data modeling, data warehousing, and data quality best practices.

Preferred Skills
- Experience with streaming data (Spark Streaming, Kafka).
- Knowledge of containerization tools like Docker and orchestration with Kubernetes.
- Familiarity with workflow orchestration tools (Airflow, Luigi).
- Exposure to machine learning pipelines and MLOps concepts.

Education
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
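Because the posting pairs Spark with Kafka for streaming ingestion, here is a hedged sketch of a minimal Structured Streaming job in PySpark; the broker address, topic, schema, and output paths are placeholders rather than details from the listing, and the spark-sql-kafka connector package must be available on the cluster.

```python
# Hedged sketch: consume a Kafka topic with Spark Structured Streaming and land it as Parquet.
# Broker, topic, schema, and paths are assumptions; requires the spark-sql-kafka connector.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
       .option("subscribe", "events")                       # hypothetical topic
       .load())

events = (raw
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/events/")                   # hypothetical sink
         .option("checkpointLocation", "/data/checkpoints/events/")
         .outputMode("append")
         .start())

query.awaitTermination()
```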

Posted 2 weeks ago

Apply

50.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About the Client
Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world. It provides a variety of services, including consulting, technology, professional, and outsourcing services.

Job Title: Data Modeler
Location: Hyderabad
Experience: 6+ years
Employment Type: Contract to hire
Work Mode: Hybrid
Notice Period: Immediate joiners

Job Description
• Capable of developing and configuring data pipelines across a variety of platforms and technologies.
• Possesses the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools).
• Strong experience writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, and Hive.
• Experience with GCP (particularly Airflow, Dataproc, BigQuery) is an advantage.
• Experience creating solutions that power AI/ML models and generative AI.
• Ability to work independently on specialized assignments within the context of project deliverables.
• Takes ownership of providing solutions and tools that iteratively increase engineering efficiencies.
• Capable of creating designs that embed standard processes, systems and operational models into the BAU approach for end-to-end execution of data pipelines.
• Demonstrates problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
• Communicates openly and honestly using sophisticated oral, written and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount.
• Able to deliver materials of the highest quality to management against tight deadlines.
• Able to work effectively under pressure with competing and rapidly changing priorities.
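The role leans on schedulers such as Control-M or Airflow to run SQL/PySpark pipelines. Below is a hedged sketch of a small Airflow DAG, assuming Airflow 2.x import paths and scheduling syntax; the DAG id, task callables, and schedule are invented for illustration only.

```python
# Hedged sketch: a minimal Airflow 2.x DAG that runs a daily extract-then-load sequence.
# DAG id, schedule, and the Python callables are assumptions for the example.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("pull data from the source system for", context["ds"])


def load(**context):
    print("load curated data into the warehouse for", context["ds"])


with DAG(
    dag_id="daily_customer_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```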

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Responsibilities include working on MDM platforms (ETL, data modelling, data warehousing) and managing complex, database-related analysis; designing, implementing and supporting moderate to large databases. The role provides production support, enhances existing data assets, and designs and develops ETL processes for a large data warehouse.

Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Undergraduate degree or equivalent experience
- Experience with Master Data Management platforms (ETL or EAI), data warehousing concepts, code management, and automated testing
- Experience developing ETL design guidelines, standards and procedures to ensure a manageable ETL infrastructure across the enterprise
- Experience with HDFS, Hive, Spark, and NoSQL (HBase)
- Solid command of MS SQL/Oracle SQL, MongoDB, PL/SQL and complex data analysis using SQL queries
- Solid knowledge of data architecture concepts
- Solid knowledge of reporting and analytics concepts
- Knowledge of software engineering best practices, with experience implementing CI/CD using Jenkins
- Knowledge of the Agile methodology for delivering software solutions

Preferred Qualifications
- Development experience in the Big Data ecosystem, with the ability to design, develop, document and architect Hadoop applications
- Skills in SQL Server DB, Windows Server file handling, and PowerShell scripting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

UnitedHealth Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two platforms: UnitedHealthcare (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information- and technology-enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, develops broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibility
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's / 4-year university degree
- 5+ years of experience
- Development experience on Azure and Databricks
- Hands-on experience developing ETL pipelines using Databricks and ADF (Azure Data Factory)
- Hands-on experience in Python, PySpark, and Spark SQL
- Hands-on experience migrating on-premises environments to Azure Cloud
- Azure cloud exposure (Azure services such as Virtual Machines, Load Balancer, SQL Database, Azure DNS, Blob Storage, Azure AD, etc.)
- Good to have: hands-on experience with a Big Data platform (Hadoop, Hive) and SQL scripting
- Good to have: experience with Scala, Snowflake, and the healthcare domain
- Good to have: experience with CI/CD tools such as GitHub Actions
- Ability to develop innovative approaches to performance optimization and automation
- Proven excellent verbal communication and presentation skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About The Role
Grade Level (for internal use): 09

The Team
We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy, promoting innovation. You should feel empowered to iterate over ideas and experiment without fear of failure.

Impact
You will enable the S&P business to showcase our proprietary S&P Global data, combine it with curated alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What You Can Expect
An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities
We are looking for a self-motivated, enthusiastic and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies spanning web applications, data pipelines, big data, machine learning and multi-cloud. The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing.
- Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests.
- Past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible and Prometheus or related cloud technologies.
- Good understanding of single, hybrid and multi-cloud architecture, preferably with hands-on experience.
- Active participation in all scrum ceremonies; follow Agile best practices effectively.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Produce technical design documents and conduct technical walkthroughs.
- Document and demonstrate solutions using technical design docs, diagrams and stubbed code.
- Collaborate effectively with technical and non-technical stakeholders.
- Respond to and resolve production issues.

What We Are Looking For
- Minimum of 5-8 years of significant experience in application development.
- Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
- Experience working with high-volume data and computationally intensive systems.
- Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must.
- Proficiency in the development environment, including IDE, web and application servers, Git, continuous integration, unit-testing tools and defect management tools.
- Domain knowledge in the financial industry and capital markets is a plus.
- Excellent communication skills, with strong verbal and writing proficiencies.
- Mentor teams, innovate and experiment, give shape to business ideas and present to key stakeholders.

Required Technical Skills
- Build data pipelines; utilize platforms like Snowflake, Talend, Databricks, etc.
- Utilize cloud managed services like AWS Step Functions, AWS Lambda, AWS DynamoDB.
- Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow.
- Develop federated data services to provide scalable and performant data APIs: REST, GraphQL, OData.
- Write infrastructure as code to develop sandbox environments.
- Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
- Feed data at scale to clients that are geographically distributed.
- Experience building sophisticated and highly automated infrastructure.
- Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc.
- Demonstrated ability to adapt to new technologies and learn quickly.

Desirable Technical Skills
Java, Spring Boot, React, HTML/CSS, API development, microservices patterns, cloud technologies and managed services (preferably AWS), Big Data and analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you - and your career - need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
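Among the required skills is building federated data services that expose data through REST or GraphQL APIs. As a hedged illustration of the REST side only, and not S&P Global's actual stack, here is a tiny FastAPI service; the endpoint, in-memory dataset, and field names are invented for the example.

```python
# Hedged sketch: a minimal REST data API, standing in for the "federated data services"
# the posting mentions. Endpoint paths, the in-memory dataset, and field names are assumptions.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-data-api")


class PricePoint(BaseModel):
    ticker: str
    close: float


# Stand-in for a query against Snowflake/Databricks or another backing store.
FAKE_STORE = {"ABC": PricePoint(ticker="ABC", close=101.25)}


@app.get("/prices/{ticker}", response_model=PricePoint)
def get_price(ticker: str) -> PricePoint:
    record = FAKE_STORE.get(ticker.upper())
    if record is None:
        raise HTTPException(status_code=404, detail="ticker not found")
    return record

# Run locally with:  uvicorn app:app --reload   (assuming this file is saved as app.py)
```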

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class degree in Engineering/Technology (4-year graduate course)
- 2-4 years' experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation and manipulation
- Experience modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines; proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica
- Big Data: Exposure to 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs and ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.

Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Consultant, Performance Analytics - Advisors & Consulting Services

Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard's rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants.

The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings. Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard

Roles and Responsibilities
Location: Hyderabad
- Provide creative input on projects across a range of industries and problem statements
- Contribute to the development of analytics strategies and programs for regional and global clients by leveraging data and technology solutions to unlock client value
- Collaborate with the Mastercard team to understand clients' needs, agenda, and risks
- Develop working relationships with client analysts/managers, and act as a trusted and reliable partner

Team Collaboration & Culture
- Collaborate with senior project delivery consultants to identify key findings, prepare effective presentations, and deliver recommendations to clients
- Independently identify trends, patterns, issues, and anomalies in the defined area of analysis, and structure and synthesize your own analysis to highlight relevant findings
- Lead internal and client meetings, and contribute to project management
- Contribute to the firm's intellectual capital
- Receive mentorship from performance analytics leaders for professional growth and development

Qualifications
Basic Qualifications
- 4-6 years of overall career experience in Performance Analytics, or 2-3 years of post-MBA/Master's experience in Performance Analytics
- Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics
- Experience managing clients or internal stakeholders
- Ability to analyze large datasets and synthesize key findings
- Proficiency using data analytics software (e.g., Python, R, SQL, SAS)
- Advanced Word, Excel, and PowerPoint skills
- Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment
- Ability to communicate effectively in English and the local office language (if applicable)
- Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs

Preferred Qualifications
- Additional data and analytics experience in building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI), or working with the Hadoop framework and coding using Impala, Hive, or PySpark
- Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence
- Experience managing tasks or workstreams in a collaborative team environment
- Ability to identify problems, brainstorm and analyze answers, and implement the best solutions
- Relevant industry expertise

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-245747
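The preferred qualifications mention coding against Hadoop data with Impala, Hive, or PySpark to produce descriptive analytics. Below is a hedged sketch of that kind of summary query in PySpark, not Mastercard's actual tooling; the table and column names are placeholders invented for the example.

```python
# Hedged sketch: descriptive analytics over a Hive table with PySpark, the kind of
# summary a Performance Analytics consultant might start from. Names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

txns = spark.table("analytics_db.card_transactions")   # hypothetical table

summary = (txns
           .groupBy("merchant_category")
           .agg(F.countDistinct("card_id").alias("active_cards"),
                F.avg("amount").alias("avg_ticket"),
                F.expr("percentile_approx(amount, 0.5)").alias("median_ticket"))
           .orderBy(F.col("active_cards").desc()))

summary.show(10, truncate=False)   # top categories by active cards
```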

Posted 2 weeks ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description
Senior Technical Manager, Full Stack - C2
Location: Chennai
Employment Type: Permanent

Key Responsibilities
- Solution Design: Architect and design end-to-end data solutions, ensuring scalability, performance, and reliability.
- Project Leadership: Drive delivery of the data roadmap, managing timelines, resources, and stakeholder expectations.
- Data Warehousing: Oversee the development and optimization of SQL warehousing processes and structures.
- Reporting and Analytics: Lead the creation and maintenance of dashboards and reports using Power BI and SSRS, ensuring data integrity and accessibility.
- ETL Processes: Design and implement ETL processes using SSIS, ensuring data is accurately transformed and loaded.
- Collaboration: Work closely with cross-functional teams to gather requirements and translate them into technical specifications.
- Mentorship: Provide technical guidance and mentorship to team members, fostering a culture of continuous learning.
- Best Practices: Establish and enforce best practices for data governance, quality, and security.

Basic Functions
- 12+ years of experience in enterprise application design, development and support.
- Design, develop, and maintain enterprise BI reports, visualisations and dashboards in Power BI.
- Handle a high-volume SQL warehouse; extensive knowledge of MS SQL, SSIS and Visual Report.
- Cloud development and deployment experience on Azure cloud services such as Azure SQL, Data Factory and Databricks; Synapse is nice to have.
- Good to have: knowledge of Python for handling Big Data using Spark.
- Source code management knowledge: Git, DevOps, coding champion and so on.
- Responsible for leading detailed design, end-to-end development (front-end and back-end), unit testing and integration of applications; design client-side and server-side architecture.
- Should have people management responsibilities.
- Produce scalable, flexible, high-quality code that satisfies both functional and non-functional requirements.
- Develop, deploy, test and maintain technical assets in a highly secure and integrated enterprise computing environment; support functional testing.
- Cross-train and mentor team members for complete knowledge of technologies.
- Analyze and translate business requirements into technical design with security and data protection settings.
- Build features and applications with a mobile-responsive design.
- Collaborate and communicate with the on-site project team and business users as required.
- Work with development teams and product managers to ideate software solutions.
- Proactively manage expectations regarding roadblocks in the critical path to help ensure successful delivery of the solution.
- Own all project deliverables and ensure proper communication between teams and quality levels; responsible for end-to-end solution delivery.
- Comprehend the fundamental solution being developed/deployed: its business value, blueprint, how it fits with the overall architecture, risks, and more.
- Provide inputs on solution and design effort to build scalable features and functionality.

Essential Functions
- Multi-disciplinary technologist who enjoys designing and executing healthcare solutions.
- Understanding of the US healthcare value chain and key impact drivers (payer and/or provider).
- Strong problem-solving and analytical skills and the ability to "roll up your sleeves" and work with a client to create timely solutions and resolutions.
- Ability to work on multiple product features simultaneously.
- Quick learner with the ability to understand a product's functionality end to end.
- Opportunity to try out bleeding-edge technologies to provide POCs, which will be evaluated and put to use if approved.
- Strong knowledge of algorithms, design patterns and fundamental computer science concepts.
- Experience working in Agile methodologies (Scrum) and familiarity with iterative development cycles.
- Experience implementing authentication and authorization with OAuth, and use of Single Sign-On and SAML-based authentication.
- Familiarity with common stacks.

Primary Internal Interactions
- Review with the overall Product Manager and AVP for improvements in the product development lifecycle.
- Assessment meetings with VP and above for additional product development features.
- Train and mentor junior team members.

Primary External Interactions
- Communicate with onshore stakeholders and executive team members.
- Help the Product Management Group set the product roadmap and identify future sellable product features.
- Client interactions to better understand expectations and streamline solutions; if required, act as a bridge between the client and the technology teams.

Technical Skills (Required)
- Azure cloud: Azure SQL, Azure Data Factory, Databricks
- Power BI reports and self-service Power BI report building
- Data analysis tools (online analytical processing (OLAP), ETL frameworks) and ETL data loads
- SQL Server 2008 and above: SQL, stored procedures, functions
- Must have experience with Azure-hosted cloud applications

Technical Skills (Nice to Have)
- Azure Synapse
- Experience with Big Data tools, including but not limited to Python, PySpark, Hive
- Expertise in US healthcare insurance
- Experience with mobile applications and Power BI report development
- Stack Overflow account score, technical blogs and technical write-ups
- Experience in cloud and NLP technologies
- Certifications in Azure, Agile and Waterfall methodologies

Process-Specific Skills (Business Domain)
- US healthcare insurance and payer analytics
- Care coordination and care optimization
- Population health analytics and risk management
- Member 360 view and analytics
- Gaps and compliance measures
- Payer management and code classification management
- Utilization and cost management

Soft Skills
- Understanding of the healthcare business vertical and the business terms within it
- Good analytical skills
- Strong communication skills, oral and written
- Ability to work with various stakeholders across various geographies
- Excellent team player, with the ability to build and sustain teams; should also be able to function as an individual contributor if required
- Mentor people and create a high-performing organization (foster relationships, resolve conflicts, and so on, while delivering performance feedback)

Working Hours
General shift: 11 AM to 8 PM; may be required to extend as per project release needs.

Education Requirements
Master's or bachelor's degree with good grades from a top-tier college, preferably with an engineering background.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Naukri logo

Site Reliability Engineer Requirements: We are seeking a proactive and technically strong Site Reliability Engineer (SRE) to ensure the stability, performance, and scalability of our Data Engineering Platform. You will work on cutting-edge technologies including Cloudera Hadoop, Spark, Airflow, NiFi, and Kubernetes, ensuring high availability and driving automation to support massive-scale data workloads, especially in the telecom domain. Key Responsibilities • Ensure platform uptime and application health as per SLOs/KPIs • Monitor infrastructure and applications using ELK, Prometheus, Zabbix, etc. • Debug and resolve complex production issues, performing root cause analysis • Automate routine tasks and implement self-healing systems • Design and maintain dashboards, alerts, and operational playbooks • Participate in incident management, problem resolution, and RCA documentation • Own and update SOPs for repeatable processes • Collaborate with L3 and Product teams for deeper issue resolution • Support and guide the L1 operations team • Conduct periodic system maintenance and performance tuning • Respond to user data requests and ensure timely resolution • Address and mitigate security vulnerabilities and compliance issues Technical Skillset • Hands-on with Spark, Hive, Cloudera Hadoop, Kafka, Ranger • Strong Linux fundamentals and scripting (Python, Shell) • Experience with Apache NiFi, Airflow, Yarn, and Zookeeper • Proficient in monitoring and observability tools: ELK Stack, Prometheus, Loki • Working knowledge of Kubernetes, Docker, Jenkins CI/CD pipelines • Strong SQL skills (Oracle/Exadata preferred) • Familiarity with DataHub, DataMesh, and security best practices is a plus • Strong problem-solving and debugging mindset • Ability to work under pressure in a fast-paced environment • Excellent communication and collaboration skills • Ownership, customer orientation, and a bias for action
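Purely as an illustration of the "automate routine tasks and implement self-healing systems" responsibility in the posting above, here is a minimal Python sketch of a health probe that attempts a restart when a service stops responding. The endpoint, service name, and remediation command are hypothetical assumptions for the example, not details from the posting.

```python
# Hypothetical self-healing check for a data-platform service; the endpoint,
# service unit name, and restart command are illustrative assumptions only.
import logging
import subprocess

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sre-healthcheck")

HEALTH_URL = "http://namenode.example.internal:9870/jmx"  # hypothetical endpoint
SERVICE = "hadoop-hdfs-namenode"                          # hypothetical unit name


def healthy() -> bool:
    """Return True if the health endpoint answers with HTTP 200."""
    try:
        return requests.get(HEALTH_URL, timeout=5).status_code == 200
    except requests.RequestException as exc:
        log.warning("health probe failed: %s", exc)
        return False


def main() -> None:
    if healthy():
        log.info("service OK")
        return
    log.error("service unhealthy, attempting restart of %s", SERVICE)
    # Simple remediation step; a real playbook would also page the on-call engineer
    # if the restart does not bring the service back.
    subprocess.run(["systemctl", "restart", SERVICE], check=False)


if __name__ == "__main__":
    main()
```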

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Company Description NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods manufacturers/fast-moving consumer goods and retailers with accurate, actionable information and insights and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary NielsenIQ data with other data sources to help clients around the world understand what’s happening now, what’s happening next, and how to best act on this knowledge. We like to be in the middle of the action. That’s why you can find us at work in over 90 countries, covering more than 90% of the world’s population. For more information, visit www.niq.com. Job Description Our NielsenIQ Technology teams are working on our new “Connect” platform, a unified, global, open data ecosystem powered by Microsoft Azure. Our clients around the world rely on NielsenIQ’s data and insights to innovate and grow. As a Software Engineer, you’ll be part of a team of smart, highly skilled technologists who are passionate about learning and prototyping cutting-edge technologies. Right now our platform is based in Scala, Snowflake, Databricks, Python and we continue to adopt the best of breed in cloud-native, low-latency technologies. We value CI/CD in everything that we develop. Our team is co-located and agile, with central technology hubs in Chicago, Toronto and Chennai. Develop new BE functionalities working closely with the FE team Contribute to the expansion of NRPS scope Qualifications We’re looking for people who have 6+ years of experience required Excellent level of experience with Python An experience in Scala and Databricks is appreciated Knowledge in Trino and Hive and Oracle would be a plus Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business Minimum B.S. degree in Computer Science, Computer Engineering or related field Additional Information Our Benefits Flexible working environment Volunteer time off LinkedIn Learning Employee-Assistance-Program (EAP) About NIQ NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. 
We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion Show more Show less

Posted 2 weeks ago

Apply

6.0 - 11.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Naukri logo

ETL Developer (Informatica PowerCenter with Big Data experience) This is an on-site opportunity in Jeddah (Saudi Arabia). Job Description Job Summary: We are seeking a skilled and experienced Senior ETL Developer with strong expertise in Informatica PowerCenter and enterprise data warehousing concepts. The ideal candidate will design, develop, and maintain scalable ETL pipelines, ensuring data integrity and optimal performance. You will work closely with data architects, analysts, and business stakeholders to support data integration and analytics initiatives. Key Responsibilities: Design, develop, test, and maintain ETL workflows using Informatica PowerCenter (or Informatica Intelligent Cloud Services). Build and optimize data pipelines and integration processes for data ingestion, transformation, and loading. Collaborate with data architects and business analysts to understand data requirements and build solutions accordingly. Work with relational databases (SQL Server, Oracle, etc.) to extract and load data using Informatica. Perform data profiling, data quality checks, and data validation. Optimize ETL jobs for performance, scalability, and fault tolerance. Support production ETL jobs, troubleshoot issues, and perform root cause analysis. Develop and maintain documentation for ETL processes and data flows. Assist in migrating legacy ETL processes to modern data platforms (if required).

Posted 2 weeks ago

Apply

3.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

You Lead the Way. We’ve Got Your Back. At American Express, we know that with the right backing, people and businesses have the power to progress in incredible ways. Whether we’re supporting our customers’ financial confidence to move ahead, taking commerce to new heights, or encouraging people to explore the world, our colleagues are constantly redefining what’s possible — and we’re proud to back each other every step of the way. When you join #TeamAmex, you become part of a diverse community of over 60,000 colleagues, all with a common goal to deliver an exceptional customer experience every day. We back our colleagues with the support they need to thrive, professionally and personally. That’s why we have Amex Flex, our enterprise working model that provides greater flexibility to colleagues while ensuring we preserve the important aspects of our unique in-person culture. Depending on role and business needs, colleagues will either work onsite, in a hybrid model (combination of in-office and virtual days) or fully virtually. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. BU & LOB Description: Responsible for contacting clients with overdue accounts to secure the settlement of the account. They also do preventive work to avoid future overdues on accounts that have a high exposure. Do you want to be part of a transformational journey at American Express, to realize the potential of our data assets to power the world’s best customer experience? The Risk Data team within Risk Products & Data strategy develops and maintains risk data to cater to strategies for Anti-Money Laundering (AML), in addition to managing the structured Risk Data from internal as well as external sources. The team is responsible for applying data engineering principles to data requirements with a strong focus on data governance and data quality. We lead the transformation of critical risk data processes from legacy platforms to modern platforms and integrate them into risk systems. Our team collaborates with internal capabilities stakeholders like Data Architects, Business Architects and partners to decide on the right data design/model. We are looking for a strategic, experienced individual for the role of Manager, responsible for partnering across business units, enterprise technology teams, and product/platform teams. The Manager develops a deep understanding of business logic, intent and long-term objectives to align priorities and develop long-lived solutions. Role & Responsibilities: The Manager in this position will have Anti-Money Laundering (AML) data ownership and governance. The incumbent would be leading a team of 3-4 data analysts and partner closely with credit strategy and compliance teams in building best-in-class data assets to cater for production use cases and analytical needs, while maintaining a strong data governance and quality control framework. The candidate will be responsible for: End-to-end ownership of all data related to AML across platforms and processes. Subject matter expert for the work stream he/she leads, with in-depth knowledge of risk systems and processes.
Thought leadership for the respective workstream he/she leads Ability to create a strategy and roadmap for the initiatives he/she is leading Responsible for ensuring data engineering principles are applied to each data requirement Risk data ownership for data used in our production use cases and analytics Define data quality controls (like detective, preventive, B&Cs) for batch and real time data & partners with tech teams for implementation Lead the transformation of critical risk data process on the POD platforms to Cornerstone and integrate them into risk systems Lead the development of capabilities to reduce manual operations and prevent operational risk Collaborate with internal capabilities stakeholders like Data Architects, Business Architects and partners to decide on the right data design/model Educate stakeholder community on the usage of risk data Reviewing and updating metadata and lineage standards and process guides published by the Enterprise Data Governance team Minimum Qualifications: A successful candidate will have: The position needs to have in depth knowledge of Amex Credit Risk Systems 3-4 years of hands on experience working on large size Capability or Analytical projects Credit/Fraud risk management experience and understanding of credit lifecycle is preferred Advanced Communicator Thought leadership and solution-oriented mindset Ability to think strategically and set POA direction Ability to build strong relationships in a cross-functional environment. Clear, effective written and oral communication skills Strong Analytical Skills Proficient in collaboration to drive results Deep understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) Preferred experience with Python, Hive, SQL, Hadoop Graduate degree in Computer Science, Mathematics, Statistics or Engineering Proven track record of driving results in a fast-paced environment often with significant ambiguity and needing to make decisions with less than perfect information. We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations. Show more Show less
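As a hedged illustration of the kind of batch data-quality control this posting describes (volume and null-rate checks on a risk table), the following PySpark sketch uses a hypothetical table name and thresholds; it is not American Express's actual tooling or process.

```python
# Hypothetical batch data-quality checks (minimum row count, null-rate threshold);
# the table name, column, and thresholds are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

df = spark.table("aml_transactions")   # hypothetical Hive table

total = df.count()
null_ids = df.filter(F.col("customer_id").isNull()).count()

checks = {
    # Detective control: did the batch arrive with a plausible volume?
    "min_row_count": total >= 1_000,
    # Preventive threshold: key identifier should almost never be null.
    "customer_id_null_rate": (null_ids / max(total, 1)) < 0.01,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed")
```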

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. With a focus on digitization, innovation, and analytics, the Enterprise Digital teams create central, scalable platforms and customer experiences to help markets across all of these priorities. The charter is to drive scale for the business and accelerate innovation for both immediate impact and long-term transformation of our business. A unique aspect of Enterprise Digital Teams is the integration of diverse skills across its remit. Enterprise Digital Teams has a very broad range of responsibilities, resulting in a broad range of initiatives around the world. The American Express Enterprise Digital Experimentation & Analytics (EDEA) team leads the Enterprise Product Analytics and Experimentation charter for Brand & Performance Marketing and Digital Acquisition & Membership experiences as well as Enterprise Platforms. The focus of this collaborative team is to drive growth by enabling efficiencies in paid performance channels and to evolve our digital experiences with actionable insights & analytics. The team specializes in using data around digital product usage to drive improvements in the acquisition customer experience to deliver higher satisfaction and business value. About this Role: This role will report to the Manager of the Membership Experience Analytics team within Enterprise Digital Experimentation & Analytics (EDEA) and will be based in Gurgaon. The candidate will be responsible for delivery of highly impactful analytics to optimize our Digital Membership Experiences across Web & App channels. Deliver strategic analytics focused on Digital Membership experiences across Web & App aimed at optimizing our Customer experiences. Define and build key KPIs to monitor the acquisition journey performance and success. Support the development of new products and capabilities. Deliver read-outs of experiments, uncovering insights and learnings that can be utilized to further optimize the customer journey. Gain deep functional understanding of the enterprise-wide product capabilities and associated platforms over time and ensure analytical insights are relevant and actionable. Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with AXP closed-loop data. Minimum Qualifications Advanced degree in a quantitative field (e.g. Finance, Engineering, Mathematics, Computer Science). Strong programming skills are preferred. Some experience with Big Data programming languages (Hive, Spark), Python, SQL.
Experience in large-scale data processing and handling; an understanding of data science is a plus. Ability to work in a dynamic, cross-functional environment, with strong attention to detail. Excellent communication skills with the ability to engage, influence, and encourage partners to drive collaboration and alignment. Preferred Qualifications Strong analytical/conceptual thinking competence to solve unstructured and complex business problems and articulate key findings to senior leaders/partners in a succinct and concise manner. Basic knowledge of statistical techniques for experimentation & hypothesis testing, regression, t-test, chi-square test. We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations. Show more Show less
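For readers unfamiliar with the hypothesis-testing techniques this posting lists (t-test, chi-square), here is a small, self-contained Python example of an experiment read-out; the metric values and conversion counts are made up purely for illustration.

```python
# Illustrative A/B experiment read-out using the tests named above; the session
# lengths and conversion counts below are fabricated example numbers.
import numpy as np
from scipy import stats

# Two-sample t-test on a continuous metric (e.g. session length) per variant.
control = np.array([4.1, 3.8, 5.0, 4.4, 4.7])
variant = np.array([4.9, 5.2, 4.8, 5.5, 5.1])
t_stat, t_p = stats.ttest_ind(variant, control, equal_var=False)

# Chi-square test on conversion counts: rows are [converted, not converted].
table = np.array([[120, 880],   # control
                  [150, 850]])  # variant
chi2, chi_p, dof, _ = stats.chi2_contingency(table)

print(f"t-test p-value: {t_p:.3f}, chi-square p-value: {chi_p:.3f}")
```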

Posted 2 weeks ago

Apply

1.0 - 6.0 years

8 - 12 Lacs

Pune

Work from Office

Naukri logo

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. Data Engineer - Data Engineering & Analytics What you'll do: Create and maintain optimal data pipeline architecture. Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability. Design, develop and deploy high-volume ETL pipelines to manage complex and near-real-time data collection. Develop and optimize SQL queries and stored procedures to meet business requirements. Design, implement, and maintain REST APIs for data interaction between systems. Ensure performance, security, and availability of databases. Handle common database procedures such as upgrade, backup, recovery, migration, etc. Collaborate with other team members and stakeholders. Prepare documentation and specifications. What you'll bring: Bachelor’s degree in Computer Science, Information Technology, or a related field. 1+ years of experience with SQL, T-SQL, Azure Data Factory or Synapse, or relevant ETL technology. Strong analytical skills (impact/risk analysis, root cause analysis, etc.). Proven ability to work in a team environment, creating partnerships across multiple levels. Demonstrated drive for results, with appropriate attention to detail and commitment. Hands-on experience with Azure SQL Database. Perks & Benefits ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed.
Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com

Posted 2 weeks ago

Apply

4.0 - 9.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Naukri logo

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. ZS’s India Capability & Expertise Center (CEC) houses more than 60% of ZS people across three offices in New Delhi, Pune and Bengaluru. Our teams work with colleagues around the world to deliver real-world solutions to the clients who drive our business. The CEC maintains standards of analytical, operational and technological excellence to deliver superior results to our clients. ZS’s Beyond Healthcare Analytics (BHCA) Team is shaping one of the key growth vectors for ZS. Beyond Healthcare engagements comprise clients from industries such as quick-service restaurants, Technology, Food & Beverage, Hospitality, Travel, Insurance, Consumer Packaged Goods and other such industries across North America, Europe & the South East Asia region. The BHCA India team currently has a presence across the New Delhi, Pune and Bengaluru offices and is continuously expanding further at a great pace. The BHCA India team works with colleagues across clients and geographies to create and deliver real-world, pragmatic solutions leveraging AI SaaS products & platforms, Generative AI applications, and other Advanced analytics solutions at scale. What You'll Do Design and implement highly available data pipelines using Spark and other big data technologies. Work with the data science team to develop new features to increase model accuracy and performance. Create standardized data models to increase standardization across client deployments. Troubleshoot and resolve issues in existing ETL pipelines. Complete proofs of concept to demonstrate capabilities and connect to new data sources. Instill best practices for software development, ensure designs meet requirements, and deliver high-quality work on schedule. Document application changes and development updates. What You'll Bring A master’s or bachelor’s degree in Computer Science or a related field from a top university. 4+ years' overall experience; 2+ years’ experience in data engineering using Apache Spark and SQL. 2+ years of experience in building and leading a strong data engineering team. Experience with full software lifecycle methodology, including coding standards, code reviews, source control management, build processes, testing, and operations.
In-depth knowledge of Python, SQL, PySpark, distributed computing, analytical databases and other big data technologies. Strong knowledge of one or more cloud environments such as AWS, GCP, and Azure. Familiarity with the data science and machine learning development process. Familiarity with orchestration tools such as Apache Airflow. Strong analytical skills and the ability to develop processes and methodologies. Experience working with cross-functional teams, including UX, business (e.g. Marketing, Sales), product management and/or technology/IT/engineering, is a plus. Characteristics of a forward thinker and self-starter who thrives on new challenges and adapts quickly to learning new knowledge. Perks & Benefits ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com
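As a rough, hypothetical sketch of the Spark pipeline work this posting describes (read raw events, standardise them, write a partitioned output), the snippet below uses illustrative paths and column names; it is not drawn from any actual ZS project.

```python
# A minimal, hypothetical Spark pipeline step: ingest raw JSON events, clean them,
# and write a partitioned Parquet output. Paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-pipeline-sketch").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")   # hypothetical source

clean = (
    raw.withColumn("event_date", F.to_date("event_ts"))    # derive a partition column
       .dropDuplicates(["event_id"])                       # idempotent re-runs
       .filter(F.col("event_type").isNotNull())            # drop malformed records
)

(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/events/"))     # hypothetical sink
```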

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Naukri logo

Location: Bangalore, India | Posted 30+ Days Ago | Job requisition ID: 30517. FICO (NYSE: FICO) is a leading global analytics software company, helping businesses in 100+ countries make better decisions. Join our world-class team today and fulfill your career potential! The Opportunity FICO's Product and Technology organization is seeking skillful and experienced Architectural Service Managers (ASMs) to collaborate successfully with various stakeholders to gather, understand, collate, and prioritize requirements. ASMs own and actively manage the backlog for their service grouping, and are experienced, skillful facilitators and cross-functional leaders. Senior Director, Architectural Service Management What You'll Contribute The ASM will be responsible for collecting and normalizing the requirements into an engineering backlog and for meeting all stakeholder needs according to the business priorities established as part of the monthly and quarterly cadence. Requirements Gathering and Prioritization: Work closely with stakeholders to gather and prioritize requirements for core Platform services. Define user stories, feature specifications, and acceptance criteria to ensure that the development team understands what needs to be built. Product Backlog Management: Own and manage the product backlog, including prioritizing features, refining user stories, and maintaining a balance between short-term and long-term goals. Ensure that the backlog reflects the most valuable and important items to be worked on by the development team as aligned to the business priorities. Collaboration with the Development Team: Work closely with the development team throughout the development process, providing guidance, clarification, and feedback on product requirements. Collaborate on sprint planning, review work in progress, and accept completed features based on predefined acceptance criteria. Cross-Functional Leadership: Collaborate with cross-functional teams, including design, engineering, marketing, sales, and support, to ensure alignment and coordination across departments. Act as a bridge between technical and non-technical stakeholders, facilitating communication and ensuring that everyone is aligned with the product functionality and targets. Priority Management: Work closely with ASM Leads on assigning stories and tasks to resources and optimize productivity based on skillsets and availability. Productivity: Monitor completion of stories, generate reports, analyze individual and team productivity, identify productivity issues and remedial actions, and work closely with Engineering Managers to take action. Resource Management: Coordinate with ASM Leads and Engineering Managers on resource and skill set needs based on ASM requirements and target deliverables. Scrum Management: Facilitate planning and deliverables, support execution, and optimize productivity with close monitoring of task completion, sequence of tasks and course correction as needed within Sprints.
Clearly document and communicate requirements as Epics and Stories in a standard, common format. Prioritize and actively manage the backlog to ensure right-sized and right-priority items are understood and developed by engineering. Create, manage, and communicate release schedules and scope to various stakeholders. Act like an owner as the primary resource for the engineering teams, supporting them on dependency management and managing constraints, tradeoffs, and impediments to timely deliverables. Create and maintain active and future development plans in well-structured Epics and Stories with good data quality and data-driven Sprint assignments. Create, maintain, and communicate reports and summary analysis around priority management, productivity monitoring, resource management, and scrum management. What We're Seeking Bachelor's degree in Computer Science or a related field and/or equivalent experience. Customer-driven requirements and technical acumen. Curious thinking. Clear and concise communication. Bias for action. Precise attention to detail. Productive collaboration. Effective facilitation. Openness to receiving and giving growth-oriented and recognition feedback. A feedback loop that informs continuous learning and improvement. Effective product/service owner experience is required. Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others. The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences. Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so. An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie. Why Make a Move to FICO At FICO, you can develop your career with a leading organization in one of the fastest-growing fields in technology today: Big Data analytics. You'll play a part in our commitment to help businesses use data to improve every choice they make, using advances in artificial intelligence, machine learning, optimization, and much more. FICO makes a real difference in the way businesses operate worldwide. Credit Scoring: FICO Scores are used by 90 of the top 100 US lenders. Fraud Detection and Security: 4 billion payment cards globally are protected by FICO fraud systems. Lending: 3/4 of US mortgages are approved using the FICO Score. Global trends toward digital transformation have created tremendous demand for FICO's solutions, placing us among the world's top 100 software companies by revenue. We help many of the world's largest banks, insurers, retailers, telecommunications providers and other firms reach a new level of success. Our success is dependent on really talented people just like you who thrive on the collaboration and innovation that's nurtured by a diverse and inclusive environment. We'll provide the support you need, while ensuring you have the freedom to develop your skills and grow your career. Join FICO and help change the way business thinks! Learn more about how you can fulfil your potential at FICO. FICO promotes a culture of inclusion and seeks to attract a diverse set of candidates for each job opportunity.
We are an equal employment opportunity employer and we're proud to offer employment and advancement opportunities to all candidates without regard to race, color, ancestry, religion, sex, national origin, pregnancy, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Research has shown that women and candidates from underrepresented communities may not apply for an opportunity if they don't meet all stated qualifications. While our qualifications are clearly related to role success, each candidate's profile is unique and strengths in certain skill and/or experience areas can be equally effective. If you believe you have many, but not necessarily all, of the stated qualifications, we encourage you to apply. Information submitted with your application is subject to the FICO Privacy Policy.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Introduction A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. Preferred Education Master's Degree Required Technical And Professional Expertise Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark. Good to have: Python. Preferred Technical And Professional Experience None Show more Show less

Posted 2 weeks ago

Apply

3.0 - 6.0 years

25 - 33 Lacs

Bengaluru

Work from Office

Naukri logo

Overview Annalect is currently seeking a Senior Data Engineer to join our Technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design and development of software products as well as research and evaluation of new technical solutions. Responsibilities Designing, building, testing, and deploying data transfers across various cloud environments (Azure, GCP, AWS, Snowflake, etc.). Developing data pipelines, monitoring, maintaining, and tuning. Writing at-scale data transformations in SQL and Python. Performing code reviews and providing leadership and guidance to junior developers. Qualifications Curiosity in learning the business requirements that are driving the engineering requirements. Interest in new technologies and eagerness to bring those technologies and out-of-the-box ideas to the team. 3+ years of SQL experience. 3+ years of professional Python experience. 3+ years of professional Linux experience. Preferred familiarity with Snowflake, AWS, GCP, Azure cloud environments. Intellectual curiosity and drive; self-starters will thrive in this position. Passion for Technology: Excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges. Additional Skills BS, MS or PhD in Computer Science, Engineering, or equivalent real-world experience. Experience with big data and/or infrastructure. Bonus for having experience in setting up petabytes of data so they can be easily accessed. Understanding of data organization, i.e. partitioning, clustering, file sizes, file formats. Experience working with classical relational databases (Postgres, MySQL, MSSQL). Experience with Hadoop, Hive, Spark, Redshift, or other data processing tools (lots of time will be spent building and optimizing transformations). Proven ability to independently execute projects from concept to implementation to launch and to maintain a live product. Perks of working at Annalect We have an incredibly fun, collaborative, and friendly environment, and often host social and learning activities such as game night, speaker series, and so much more! Halloween is a special day on our calendar since it is our Founding Day – we go all out with decorations, costumes, and prizes! Generous vacation policy. Paid time off (PTO) includes vacation days, personal days, and a Summer Friday program. Extended time off around the holiday season. Our office is closed between Xmas and New Year to encourage our hardworking employees to rest, recharge and celebrate the season with family and friends. As part of Omnicom, we have the backing and resources of a global billion-dollar company, but also have the flexibility and pace of a “startup” - we move fast, break things, and innovate. Work with a modern stack and environment to keep on learning and improving, helping to experiment with and shape the latest technologies.
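To illustrate the data-organization points this posting mentions (partitioning, clustering, file sizes, file formats), here is a small, hypothetical PySpark example of controlling partition layout and file counts when writing a large table; the paths, columns, and numbers are assumptions made for the sketch.

```python
# Hypothetical illustration of data organization when writing a large table:
# directory-level partitioning plus a bounded number of files per partition.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.parquet("s3://example-bucket/staging/clicks/")  # hypothetical input

# Aim for a modest number of similarly sized files per partition rather than
# thousands of tiny files, which slows down downstream readers.
(df.repartition(200, "country")        # hash-distribute by a clustering column
   .write
   .mode("overwrite")
   .partitionBy("click_date")          # directory-level partitioning
   .parquet("s3://example-bucket/warehouse/clicks/"))
```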

Posted 2 weeks ago

Apply

Exploring Hive Jobs in India

Hive is a popular data warehousing tool used for querying and managing large datasets in distributed storage. In India, the demand for professionals with expertise in Hive is on the rise, with many organizations looking to hire skilled individuals for various roles related to data processing and analysis.
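To make this more concrete, here is a minimal, illustrative sketch of creating and querying a Hive table through PySpark's Hive support. The table and column names are hypothetical, and the snippet assumes a Spark installation configured against a Hive metastore.

```python
# Minimal sketch of working with Hive via PySpark; table/column names are
# hypothetical and the environment is assumed to have a Hive metastore.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-intro-sketch")
    .enableHiveSupport()      # lets spark.sql() read and write Hive tables
    .getOrCreate()
)

# Create a simple managed Hive table and load a row.
spark.sql("""
    CREATE TABLE IF NOT EXISTS web_events (
        user_id BIGINT,
        page    STRING,
        ts      TIMESTAMP
    )
    STORED AS PARQUET
""")

spark.sql("INSERT INTO web_events VALUES (1, 'home', current_timestamp())")

# A standard HiveQL aggregation.
spark.sql("""
    SELECT page, COUNT(*) AS views
    FROM web_events
    GROUP BY page
""").show()
```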

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

These cities are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.

Average Salary Range

The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.

Related Skills

Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.

Interview Questions

  • What is Hive and how does it differ from traditional databases? (basic)
  • Explain the difference between HiveQL and SQL. (medium)
  • How do you optimize Hive queries for better performance? (advanced)
  • What are the different types of tables supported in Hive? (basic)
  • Can you explain the concept of partitioning in Hive tables? (medium)
  • What is the significance of metastore in Hive? (basic)
  • How does Hive handle schema evolution? (advanced)
  • Explain the use of SerDe in Hive. (medium)
  • What are the various file formats supported by Hive? (basic)
  • How do you troubleshoot performance issues in Hive queries? (advanced)
  • Describe the process of joining tables in Hive. (medium)
  • What is dynamic partitioning in Hive and when is it used? (advanced)
  • How can you schedule jobs in Hive? (medium)
  • Discuss the differences between bucketing and partitioning in Hive. (advanced; a worked sketch follows this list)
  • How do you handle null values in Hive? (basic)
  • Explain the role of the Hive execution engine in query processing. (medium)
  • Can you give an example of a complex Hive query you have written? (advanced)
  • What is the purpose of the Hive metastore? (basic)
  • How does Hive support ACID transactions? (medium)
  • Discuss the advantages and disadvantages of using Hive for data processing. (advanced)
  • How do you secure data in Hive? (medium)
  • What are the limitations of Hive? (basic)
  • Explain the concept of bucketing in Hive and when it is used. (medium)
  • How do you handle schema evolution in Hive? (advanced)
  • Discuss the role of Hive in the Hadoop ecosystem. (basic)
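As referenced above, the sketch below illustrates partitioning, dynamic partitioning, and bucketing using standard HiveQL issued through PySpark; all table names, columns, and values are hypothetical examples rather than answers tied to any specific employer.

```python
# Illustrative sketch for the partitioning/bucketing questions above.
# Table names, columns, and values are hypothetical; syntax is standard HiveQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Partitioning: rows are stored in separate directories per partition value,
# so filters on the partition column prune whole directories at query time.
spark.sql("""
    CREATE TABLE IF NOT EXISTS orders (
        order_id BIGINT,
        amount   DOUBLE
    )
    PARTITIONED BY (order_date STRING)
    STORED AS PARQUET
""")

# Dynamic partitioning: the partition value is derived from the data itself.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("""
    INSERT INTO orders PARTITION (order_date)
    VALUES (101, 49.90, '2024-01-15'), (102, 15.00, '2024-01-16')
""")

# Bucketing: rows are hashed on a column into a fixed number of files, which
# helps joins and sampling rather than partition pruning (DDL shown only).
spark.sql("""
    CREATE TABLE IF NOT EXISTS orders_bucketed (
        order_id BIGINT,
        amount   DOUBLE
    )
    CLUSTERED BY (order_id) INTO 8 BUCKETS
    STORED AS PARQUET
""")

# A query that benefits from partition pruning.
spark.sql("SELECT COUNT(*) FROM orders WHERE order_date = '2024-01-15'").show()
```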

Closing Remark

As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies