
10949 Apache Jobs - Page 45

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description:

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and in offering flexibility to our employees. We use a multi-faceted approach to flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow, and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence, and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview*
The Data Analytics Strategy platform and decision tool team is responsible for the data strategy of the entire CSWT and for developing the platforms that support that strategy.
The Data Science Platform, Graph Data Platform, and Enterprise Events Hub are key platforms of the Data Platform initiative.

Job Description*
As a Senior Hadoop Developer building Hadoop components in SDP (Strategic Data Platform), the individual will be responsible for understanding the design, proposing high-level and detailed design solutions, and ensuring that coding practices and quality comply with software development standards. Working as an individual contributor on projects, the person should have strong analytical skills to make quick decisions under pressure and solid experience writing complex queries on a large cluster. The role involves engaging with architecture teams on design solutions, proposing new technology adoption ideas, attending project meetings, partnering with nearshore and offshore teammates in an agile environment, and coordinating with other application teams, development, testing, and upstream and downstream partners.

Responsibilities:
- Develop high-performance, scalable analytics solutions on the Big Data platform to facilitate the collection, storage, and analysis of massive data sets from multiple channels.
- Develop efficient utilities, data pipelines, and ingestion frameworks that can be reused across multiple business areas.
- Use in-depth knowledge of the Hadoop stack and storage technologies, including HDFS, Spark, MapReduce, YARN, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows.
- Perform data analysis, coding, and performance tuning; propose improvement ideas; drive development activities at offshore.
- Analyze, modify, and tune complex Hive queries.
- Write and modify Python and shell scripts (hands-on).
- Provide guidance and mentorship to junior teammates.
- Work with strategic partners to understand requirements and produce high-level and detailed designs that address real-time production issues.
- Partner with nearshore and offshore teammates in an Agile environment, coordinating with other application teams, development, testing, and upstream/downstream partners.
- Work on multiple projects concurrently; take ownership of and pride in the work; attend project meetings, understand requirements, design solutions, and develop code.
- Identify gaps in technology and propose viable solutions.
- Identify improvement areas within the application and work with the respective teams to implement them.
- Ensure adherence to defined processes, quality standards, and best practices, with high quality in all deliverables.

Desired Skills*
- Data Lake Architecture: understanding of the Medallion architecture.
- Ingestion frameworks: knowledge of frameworks for ingesting structured, unstructured, and semi-structured data.
- Data warehouse: familiarity with Apache Hive and Impala.
- Performs Continuous Integration and Continuous Deployment (CI/CD) activities.
- Hands-on experience working on the Cloudera Data Platform (CDP) to support data science.
- Contributes to story refinement and definition of requirements; participates in estimating the work necessary to realize a story/requirement through the delivery lifecycle.
- Extensive hands-on experience supporting platforms that let modellers and analysts go through the complete model lifecycle (data munging, model development/training, governance, deployment).
- Experience with model deployment, scoring, and monitoring for batch and real time on various technologies and platforms.
- Experience with Hadoop cluster integration, including ETL, streaming, and API styles of integration.
- Experience automating deployment with Ansible playbooks and scripting.
- Experience developing and building RESTful API services in an efficient and scalable manner.
- Design, build, and deploy streaming and batch data pipelines capable of processing and storing large datasets (TBs) quickly and reliably using Kafka, Spark, and YARN.
- Experience with processing and deployment technologies such as YARN, Kubernetes/containers, and serverless compute for model development and training.
- Effective communication and strong stakeholder engagement skills; proven ability to lead and mentor a team of software engineers in a dynamic environment.

Requirements*
Education*: Graduation / Post Graduation
Experience Range*: 7 to 9 years
Foundational Skills*: Hadoop, Hive, Sqoop, Impala, Unix/Linux scripting
Desired Skills*: Python, CI/CD, ETL
Work Timings*: 11:30 AM to 8:30 PM IST
Job Location*: Chennai
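As a quick illustration of the Hive query tuning mentioned above: the single biggest lever is usually partition pruning, where a filter on the partition column lets the engine skip whole directories of data. The sketch below is pure Python and purely illustrative (the function name and `dt=YYYY-MM-DD` layout mimic a common Hive convention; this is not Hive's actual implementation).

```python
from datetime import date

def prune_partitions(partitions, start, end):
    """Keep only dt=YYYY-MM-DD partition directories inside [start, end].

    This mirrors what Hive does when a query filters on the partition
    column: non-matching partitions are never scanned at all.
    """
    selected = []
    for p in partitions:
        d = date.fromisoformat(p.split("=", 1)[1])
        if start <= d <= end:
            selected.append(p)
    return selected

# A month of daily partitions; a 3-day query touches only 3 of 31.
all_parts = [f"dt=2024-01-{day:02d}" for day in range(1, 32)]
scanned = prune_partitions(all_parts, date(2024, 1, 10), date(2024, 1, 12))
print(scanned)  # ['dt=2024-01-10', 'dt=2024-01-11', 'dt=2024-01-12']
```

The same reasoning is why unpartitioned filters (or functions applied to the partition column) force full scans and show up first in query tuning.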

Posted 1 week ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Java Architect
Experience: 12 to 20 years
Location: Chennai
Notice Period: Immediate joiners preferred / max 30 days' notice

Job Description
We are seeking an experienced Java Architect to join our team in Chennai. The ideal candidate will have a strong technical foundation in enterprise Java architecture, modern cloud platforms, and BFSI domain experience. This role requires on-site presence (no hybrid/remote mode) and demands strong leadership in designing scalable and secure enterprise solutions.

Key Responsibilities:
- Define, design, and deliver end-to-end architectural solutions for enterprise-level applications.
- Guide development teams in applying best practices and architectural standards.
- Collaborate with stakeholders to translate business requirements into technical specifications.
- Lead API integration, microservices architecture, and streaming solutions.
- Ensure cloud-native architecture and migration strategies using OpenShift, PCF, AWS, Azure, or Oracle Cloud.
- Review and optimize the performance, scalability, and security of applications.
- Mentor and guide junior architects and senior developers.

Required Skills and Experience:
- Total experience: 12+ years in Java-based enterprise application development.
- Architecture experience: minimum 2 years as a Java Architect or in an architecture-focused role.
- BFSI domain: minimum 2 years of hands-on experience in BFSI (Banking, Financial Services & Insurance) projects.
- Cloud exposure: at least 3 years of experience working with OpenShift, PCF (Pivotal Cloud Foundry), AWS, Azure, or Oracle Cloud.
- Strong hands-on expertise in Java/J2EE, Spring/Spring Boot, microservices architecture, API design and integration (REST/SOAP), and streaming technologies (Kafka, Apache Flink, etc.).
- Knowledge of CI/CD pipelines, containerization (Docker/Kubernetes), and DevOps practices is a plus.
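One resilience pattern a microservices architect in this role would be expected to know is the circuit breaker: after repeated failures against a downstream service, callers fail fast for a cooling-off period instead of piling on. A minimal sketch follows, written in Python for brevity rather than the Java/Spring stack the posting names (in practice this would be Resilience4j or similar; all names here are illustrative).

```python
import time

class CircuitBreaker:
    """Toy circuit breaker: after max_failures consecutive failures the
    breaker opens and calls fail fast for reset_after seconds; a later
    call half-opens the breaker and retries the downstream service."""

    def __init__(self, max_failures=3, reset_after=30.0, clock=time.monotonic):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.clock = clock            # injectable for testing
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if self.clock() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None     # half-open: allow one trial call
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = self.clock()
            raise
        self.failures = 0             # success closes the circuit
        return result

breaker = CircuitBreaker(max_failures=2)
print(breaker.call(lambda: "healthy"))  # healthy
```

The design point is that the breaker protects both sides: the caller gets a fast, predictable error, and the struggling service gets breathing room to recover.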

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Hiring for one of our MNC clients (full-time).
Interview Mode: F2F
Location: Noida
NP: Immediate joiner

Inviting applications for the role of Lead Consultant – Java Full Stack. In this role, you will be responsible for developing Microsoft Access databases, including tables, queries, forms, and reports, using standard IT processes, with data normalization and referential integrity.

Responsibilities
- Experience with Spring Boot and microservices development.
- Extensive experience working with Java REST APIs.
- Extensive experience in Java 8-17 SE.
- Experience with unit testing frameworks: JUnit or Mockito.
- Experience with Maven/Gradle.
- Experience in Angular 16+ and RxJS; NgRx is mandatory, along with unit testing using Jest/Jasmine.
- Professional, precise communication skills.
- Experience in API design, troubleshooting, and tuning for performance.
- Experience in designing and troubleshooting Java API services and microservices.
- Experience with any CI/CD tool.
- Experience in Apache Kafka is an added advantage.

Qualifications we seek in you!
Minimum Qualifications
- BE/B.Tech/M.Tech/MCA
- Excellent communication skills
- Good team player

Preferred Qualifications
- Experience with Spring Boot and microservices development.
- Extensive experience working with Java REST APIs and Java 8-17 SE.
- Experience with unit testing frameworks (JUnit or Mockito) and Maven/Gradle.
- Experience in Angular 13+ and RxJS; NgRx is an added advantage.
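The unit-testing skills listed above (JUnit/Mockito on the backend, Jest/Jasmine on the frontend) all revolve around one pattern: inject a collaborator, replace it with a mock, and assert on both the result and the interaction. Sketched here with Python's `unittest.mock` rather than Mockito; the `PriceService` class and its REST client are invented for illustration.

```python
from unittest.mock import Mock

class PriceService:
    """Unit under test; the client (e.g. a REST client) is injected so
    tests can substitute a mock for the real dependency."""

    def __init__(self, client):
        self.client = client

    def price_with_tax(self, sku):
        base = self.client.get_price(sku)
        return round(base * 1.18, 2)  # illustrative 18% tax

# Mock the collaborator, exercise the unit, assert on the interaction.
client = Mock()
client.get_price.return_value = 100.0
svc = PriceService(client)
print(svc.price_with_tax("SKU-1"))           # 118.0
client.get_price.assert_called_once_with("SKU-1")
```

In Mockito the equivalent would be `when(client.getPrice("SKU-1")).thenReturn(100.0)` followed by `verify(client).getPrice("SKU-1")`; the pattern, not the framework, is what interviews probe.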

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana

Remote

About the Role:
Grade Level (for internal use): 09

S&P Global Commodity Insights
The Role: Engineer II, Application Support Analyst
The Location: Hyderabad/Gurgaon, India

The Team: AppOps is responsible for providing high-quality operational and technical support for all Commodity Insights (CI) business-specific applications and systems. The team provides CI business partners with first-line remote support for IT issues and requests that occur during business hours in relation to the use of CI business-specific applications, ensures that standard operating procedures are followed for all incident and service requests received by the helpdesk function, and proactively monitors applications, responding to alerts and providing the business with periodic health-check reports. We operate 24x7, which can involve working APAC/EMEA/AMER hours and requires weekend support (rotational shifts, 5 days a week). Work hours can change depending on business requirements.

The Impact: You will be the first line of support for all requests and incidents raised by Commodity Insights business partners. You will ensure the business receives a prompt response to any request and that issues are resolved within agreed service level agreements.

What's in it for you: The position is part of the global application support team, supporting users based in three time zones and across 26 offices. You will gain exposure to application/product support, technical operations, monitoring, and projects in a role where you will interact directly with the business and learn the products and systems required to support the Platts business operations.

Responsibilities:
- Provide first-line application/product support and triage of incidents and service requests for IT issues that occur during use of Platts applications.
- Technical excellence: in-depth technical understanding of all applications, monitoring tools, and available technical resources; effective weekend support, incident identification, shift handovers, major incident management, and process hygiene.
- Log and capture incidents from all sources into the ticketing system (ServiceNow), ensuring correct categorization and prioritization of IT issues.
- Application support operations: ensure operational excellence and guaranteed response times by actively monitoring application health checks and end-user emails/tickets, and by ensuring all incidents/service requests are resolved in a timely and comprehensive manner; server maintenance, monitoring, health checks, restarts, and BAU operational work.
- Provide 24x7 round-the-clock support to Platts business partners using shift patterns.
- Major incident management: engage and drive major incidents during weekends; initiate the bridge call, engage technical teams, and restore service immediately.
- Incident hygiene: adhere to the incident hygiene process, ensuring high hygiene in the incidents and requests handled.
- Knowledge management and competency development: create and share SOPs, best-practice documents, checklists, and technical knowledge articles.
- Resolve IT incidents to restore service as quickly as possible using the known-error database; escalate tickets to other teams as required.
- Actively participate in knowledge transitions, propose process initiatives, and deliver ideas and value to achieve the desired results.
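The "correct categorization and prioritization" step is typically driven by an ITIL-style impact/urgency matrix, which ServiceNow can compute automatically. A minimal sketch follows; the labels and matrix values are a common convention, not S&P Global's actual configuration.

```python
# Illustrative ITIL-style priority matrix: priority is derived from the
# impact and urgency recorded on an incident ticket.
PRIORITY = {
    ("high",   "high"):   "P1 - Critical",
    ("high",   "medium"): "P2 - High",
    ("medium", "high"):   "P2 - High",
    ("high",   "low"):    "P3 - Moderate",
    ("medium", "medium"): "P3 - Moderate",
    ("low",    "high"):   "P3 - Moderate",
    ("medium", "low"):    "P4 - Low",
    ("low",    "medium"): "P4 - Low",
    ("low",    "low"):    "P5 - Planning",
}

def prioritize(impact, urgency):
    """Map an incident's impact and urgency to a priority label."""
    return PRIORITY[(impact.lower(), urgency.lower())]

print(prioritize("High", "High"))  # P1 - Critical
```

Encoding the matrix as data rather than nested conditionals keeps the triage rules auditable, which matters when SLAs hang off the resulting priority.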
What We're Looking For:

Basic Qualifications:
- Experience working with various application monitoring systems and tools (Autosys, AppDynamics, Nagios, Naemon, Splunk preferred).
- Experience with IT service management frameworks (ITIL or similar).
- Knowledge of troubleshooting and supporting applications running on Linux (preferred) or Windows server OS.
- Exposure to industry-standard ITSM tools (ServiceNow strongly preferred).
- Experience supporting cloud computing (AWS).
- Familiarity with infrastructure concepts related to distributed applications (load balancers, networking, firewalls, NAT, virtual servers).
- Exposure to tools like PuTTY, RDP, SSH, WinSCP, MySQL Query Browser, and Oracle SQL Developer.
- Familiarity with reporting and analysis tools (beneficial but not essential).
- Experience working with collaboration platforms such as Microsoft SharePoint, Box, OneDrive, and MS Teams.
- Good understanding of the Agile framework.
- Knowledge of web servers (beneficial but not essential): Windows IIS, Linux Apache, and WebLogic (preferred).
- Knowledge of scripting languages (JScript/JavaScript, DOS batch, VBScript, Perl, Python, PowerShell, or shell script) preferred (beneficial but not essential).
- Microsoft Office / Office 365, especially Excel (macros, worksheets, and add-ins).

Preferred Qualifications:
- 5+ years of relevant experience with a bachelor's degree.

About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI).
S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country, visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 317512 Posted On: 2025-07-26 Location: Gurgaon, Haryana, India

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana

Remote

Engineer II AppOps Gurgaon, India; Hyderabad, India; Penang-Jalan, Malaysia Information Technology 317512 Job Description About The Role: Grade Level (for internal use): 09 S&P Global Commodity Insights The Role: Engineer II, Application Support Analyst, The Location: Hyderabad/Gurgaon, India The Team: AppOps is responsible for providing high quality operational and technical support for all Commodity Insights (CI) business specific applications and systems. Responsible to provide CI Business Partners with initial first line remote support for IT issues and requests which occur during business hours in relation to the use of CI business specific applications. Ensuring that standard operating procedures is followed for all incident and service requests received into the helpdesk function. Proactively monitor applications responding to alerts and providing the business with periodic health check reports. We operate 24x7 which can involve working during APAC|EMEA|AMER Hours & requires weekend support. (Rotational shifts 5 day a week). Work hours can change depending on Business requirements. Enter the grade level of the position: Grade 9 The Impact: You will be the first line of support for all requests and incidents raised by Commodity Insights business partners. You will ensure the business receives a prompt response to any requests and ensure issues are resolved within agree service level agreements What’s in it for you: The position is the part of the global application Support team supporting users based in three time zones and across 26 offices. Exposure to Application /Product support, technical operations, monitoring and projects in a role where you will interact directly with the business and learn the products and systems required to support the Platts business operations. Responsibilities: Provide initial first line Application/Product support and triage of incidents and service requests for IT issues which occur during use of Platts applications. 
Technical Excellence: In-depth Technical understanding of all Applications, Monitoring Tools, and all available technical resources. Executing Effective Weekend Support Incident Identification, Effective Shift handovers, Major Incident Mgmt. & Process Hygiene. Log and capture incidents from all sources into ticketing system (ServiceNow) ensuring correct categorization and prioritization of IT issues Application Support Operations: Ensure application operations excellence and guaranteed response times by actively monitoring application health checks, end user emails/tickets and ensuring all Incidents/service requests are resolved in a timely and comprehensive manner. Server maintenance, monitoring, health checks, restarts, and BAU operational work. Provide 24 x 7 round the clock support to Platts business partners utilizing shift patterns Major Incident Management: Engaging & driving the major Incidents during the weekends to Initiate bridge call, engage technical teams and restore the service Immediately Incident Hygiene: Adhering to the Incident Hygiene process, ensuring High Hygiene in the Incidents & requests handled. Knowledge Management and competency development: Create & share the SOPs, Best Practice documents, check list, technical knowledge articles. Resolving IT incidents to restore service as quickly as possible using known error database. Escalation of tickets to other teams as required Active participation in knowledge transitions, also coming up with Process Initiatives, deliver ideas and values to achieve the desired results. 
What We’re Looking For: Basic Qualifications: Experience working with various Application Monitoring systems and tools (Autosys / AppDynamics /Nagios/Naemon/Splunk preferred) Experience in IT Service Management frameworks (ITIL or similar) Knowledge of troubleshooting & supporting applications running on either Linux (preferred) or Windows server OS Exposure to industry standard ITSM tools (ServiceNow strongly preferred) Experience supporting Cloud computing (AWS). Familiar with infrastructure concepts related to distributed applications (Load balancers, Networking. Firewall, NAT, Virtual servers) Exposure working with tools like Putty, RDP, SSH, WinSCP, MySQL Query Browser, Oracle SQL Developer. Familiar with reporting and analyzing tools (Beneficial but not essential) Experience working collaborative platforms like Microsoft SharePoint, Box, OneDrive, MS Teams. Good understanding of Agile Framework. Any knowledge of Webservers either (Beneficial but not essential) Windows IIS Linux Apache, and WebLogic (preferred) Any knowledge of scripting languages (JScript and JavaScript DOS, VBScript, Pearl, Python, PowerShell, or shell script) preferred (Beneficial but not essential) Microsoft Office / Office 365 especially Excel (Macros, Worksheets, and add-ins) Preferred Qualifications: 5+ years of relevant experience with bachelor’s degree. About S&P Global Commodity Insights At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We’re a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating Energy Transition, S&P Global Commodity Insights’ coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). 
S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. 
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 317512 Posted On: 2025-07-26 Location: Gurgaon, Haryana, India

Posted 1 week ago


1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Experience: 1-3 years
Location: Bangalore/Pune/Hyderabad
Work mode: Hybrid

Founded in 2017, CoffeeBeans specializes in offering high-end consulting services in technology, product, and processes. We help our clients attain significant improvements in quality of delivery through impactful product launches and process simplification, and help build competencies that drive business outcomes across industries. The company uses new-age technologies to help its clients build superior products and realize better customer value. We also offer data-driven solutions and AI-based products for businesses operating in a wide range of product categories and service domains.

As a Data Engineer, you will play a crucial role in designing and optimizing data solutions for our clients. The ideal candidate will have a strong foundation in Python, experience with Databricks Warehouse SQL or a similar Spark-based SQL platform, and a deep understanding of performance optimization techniques in the data engineering landscape. Knowledge of serverless approaches, Spark Streaming, Structured Streaming, Delta Live Tables, and related technologies is essential.

What are we looking for?
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 1-3 years of experience as a Data Engineer.
- Proven track record of designing and optimizing data solutions.
- Strong problem-solving and analytical skills.

Must haves
- Python: Proficiency in Python for data engineering tasks and scripting.
- Performance Optimization: Deep understanding and practical experience in optimizing data engineering performance.
- Serverless Approaches: Familiarity with serverless approaches in data engineering solutions.

Good to have
- Databricks Warehouse SQL or Equivalent Spark SQL Platform: Hands-on experience with Databricks Warehouse SQL or a similar Spark-based SQL platform.
- Spark Streaming: Experience with Spark Streaming for real-time data processing.
- Structured Streaming: Familiarity with Structured Streaming in Apache Spark.
- Delta Live Tables: Knowledge and practical experience with Delta Live Tables or similar technologies.

What will you be doing?
- Design, develop, and maintain scalable and efficient data solutions.
- Collaborate with clients to understand data requirements and provide tailored solutions.
- Implement performance optimization techniques in the data engineering landscape.
- Work with serverless approaches to enhance scalability and flexibility.
- Utilize Spark Streaming and Structured Streaming for real-time data processing.
- Implement and manage Delta Live Tables for efficient change data capture.
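The last responsibility above, managing Delta Live Tables for change data capture, comes down to merging a stream of keyed change events into a target table. Here is a minimal pure-Python sketch of that merge logic; the table layout and change-event format are invented for illustration, and Delta Live Tables handles this declaratively and at scale:

```python
# Apply a batch of insert/update/delete change events to a table
# represented as a dict keyed by primary key.

def apply_changes(table, changes):
    """Merge CDC events into `table` in arrival order."""
    for change in changes:
        op, key = change["op"], change["key"]
        if op == "delete":
            table.pop(key, None)          # ignore deletes for unknown keys
        else:
            table[key] = change["row"]    # insert and update both upsert
    return table

customers = {1: {"name": "Asha", "city": "Chennai"}}
batch = [
    {"op": "update", "key": 1, "row": {"name": "Asha", "city": "Pune"}},
    {"op": "insert", "key": 2, "row": {"name": "Ravi", "city": "Bengaluru"}},
    {"op": "delete", "key": 3},
]
apply_changes(customers, batch)
```

This is roughly what Delta Live Tables' apply-changes functionality automates, together with the harder parts the sketch omits: event ordering, late-arriving data, and transactional storage.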

Posted 1 week ago


0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

High proficiency in Java development frameworks: Spring Boot, Microservices, Apache Camel, Spring Framework, React, jQuery.
Messaging: proficiency in RabbitMQ or other messaging frameworks such as Kafka and ActiveMQ.
E-commerce web application development.
Good understanding of design patterns.
Third-party API integration experience.

A day in the life of an Infosys Equinox employee: As part of the Infosys Equinox delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Infosys Equinox is a human-centric digital commerce platform that helps brands provide an omnichannel and memorable shopping experience to their customers. With a future-ready architecture and integrated commerce ecosystem, Infosys Equinox provides an end-to-end commerce platform covering all facets of an enterprise's e-commerce needs.

- Communication and good problem-solving skills.
- Good understanding of the functional capabilities of the e-commerce platform.
- Exposure to a variety of front-end, middleware, and back-end technologies.
- Capable of contributing code to any area in the project's technical stack.
- Understanding of coding practices, code quality, and code coverage.

Posted 1 week ago


4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Role
The Data Engineer is accountable for developing high quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 4-6 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage. 
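The "hands-on experience of building data pipelines" requirement above is easiest to picture as the classic extract-transform-load loop. A toy, standard-library-only sketch; the schema and data are invented, and real platforms such as Ab Initio or Spark add parallelism, recovery, and scheduling on top of this shape:

```python
import csv
import io
import sqlite3

# Extract rows from CSV text, transform them (cast types, filter out
# invalid records), and load them into a relational table.

RAW = "id,amount\n1, 250\n2, -40\n3, 125\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Cast types and keep only positive amounts.
    return [(int(r["id"]), int(r["amount"])) for r in rows
            if int(r["amount"]) > 0]

def load(rows):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, amount INTEGER)")
    con.executemany("INSERT INTO txn VALUES (?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
total = con.execute("SELECT SUM(amount) FROM txn").fetchone()[0]
```

The same extract/transform/load boundaries are what the listing's "automate and streamline the build, test and deployment of data pipelines" point refers to: each stage is independently testable.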
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago


0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
- Strong proficiency in Python and at least one Python web framework (e.g., Django, Flask)
- Programming Languages: Proficiency in Python
- Expertise in SQL and database management (PostgreSQL, MySQL)
- Knowledge of NoSQL databases, particularly Cassandra
- Familiarity with message brokers, especially Apache Kafka
- Proficiency in the Linux operating system
- Experience with containerization using Docker
- Understanding of container orchestration with Kubernetes
- Version control with Git
- Software Development: Experience with software development methodologies and best practices
- Data Structures and Algorithms: Strong understanding and practical application
- Object-Oriented Design (OOD): Ability to apply OOD principles for flexible and modular software
- Database Management: Knowledge of SQL and experience with both relational and NoSQL databases
- Version Control: Proficiency with Git and experience managing complex branching strategies
- Testing and Debugging: Expertise in software testing methodologies and debugging techniques
- API Development: Experience in designing and implementing RESTful APIs
- DevOps Practices: Familiarity with CI/CD pipelines and cloud platforms (e.g., AWS, Azure)
- Experience with test-driven development and automated testing frameworks (e.g., Pytest)
- Experience with the Spring Boot framework for Java applications, and Java and JavaScript, would be nice to have

(ref:hirist.tech)
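Familiarity with message brokers like Kafka starts with the producer/consumer pattern, which the standard library can sketch in-process; Kafka generalises the same idea across machines with durable, partitioned topics. The topic name and payloads below are invented:

```python
import queue
import threading

# In-process stand-in for a broker topic: producers put messages,
# consumers take them in FIFO order.

topic = queue.Queue()
received = []

def producer():
    for i in range(3):
        topic.put({"event": "order_created", "order_id": i})
    topic.put(None)  # sentinel: no more messages

def consumer():
    while True:
        msg = topic.get()
        if msg is None:
            break
        received.append(msg["order_id"])

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

What a real broker adds over this sketch is exactly what the role's other bullets touch: persistence (databases), deployment (Docker/Kubernetes), and consumer-group coordination.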

Posted 1 week ago


0.0 - 4.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have knowledge of web (PHP) development with PHP, CakePHP, MySQL, jQuery, JavaScript, AJAX, Linux, JSON, and XML.

Back-end: You should be proficient in MVC/object-oriented PHP (v5+), web scraping using regular expressions and XPath, developing complex data-driven systems without a framework, secure e-commerce crawler development, adaptive problem solving, performance optimization techniques, and debugging for issue diagnosis. Unit testing your code with assertions is essential, along with hands-on experience with Linux/UNIX, RESTful paradigms, Apache, and MySQL database efficiency analysis.

Front-end: Expertise in jQuery/JavaScript, scripting techniques such as Ajax and DOM manipulation, and a strong grasp of HTML5 and CSS3 is required.

Database: You should have a good command of MySQL (v5+) supported by phpMyAdmin, an understanding of design patterns, PHP best practices, PHP frameworks like CakePHP, scalability architecture, server performance considerations, and push notification system implementation.

The ideal candidate should hold a BS/MS degree in Computer Science or Engineering; possess development skills in PHP, MySQL, jQuery, JavaScript, AJAX, Linux, JSON, and XML; have knowledge of relational databases, version control tools, web services development, and API development using PHP; and bring a passion for sound design and coding practices, strong verbal and written communication skills, a problem-solving attitude, and consistent work ethics.

Must-have skills: PHP, Core PHP, MySQL, cURL, Linux commands, API development using PHP, scripting for process automation, web crawling, and being consistent and dependable.

Interested candidates can share their resumes at vaishali@mastheadtechnologies.com. This is a full-time position with a day shift schedule, requiring in-person work at the designated location. The application deadline is 13/07/2025, and the expected start date is 11/07/2025.
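The two scraping techniques the listing names, regular expressions and XPath, look like this in outline. Shown in Python for brevity (the role itself calls for PHP, where `preg_match_all` and DOMXPath play the same roles); the HTML snippet is invented:

```python
import re
from xml.etree import ElementTree

HTML = """<html><body>
<div class="product"><span class="price">Rs. 499</span></div>
<div class="product"><span class="price">Rs. 1,299</span></div>
</body></html>"""

# Regular expressions: pull the raw price strings by pattern.
prices = re.findall(r'class="price">([^<]+)<', HTML)

# XPath: the same data via document structure (ElementTree supports
# a limited XPath subset; this snippet is well-formed XML).
root = ElementTree.fromstring(HTML)
xpath_prices = [span.text for span in root.findall(".//span[@class='price']")]
```

In practice the XPath route is the more robust of the two on real, messy markup, which is why crawler work usually pairs a proper parser with regexes used only for final string cleanup.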

Posted 1 week ago


9.0 - 13.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Engineer (JAVA) at Home Loan Savings in Pune, India, you will play a crucial role in the deployment, configuration, and development of software within the development team. Your responsibilities will involve collaborating with production and operation units, as well as business areas to support the success of the growing domain Home Loan Savings at Deutsche Bank. You will be utilizing Continuous Integration tools to contribute to the digitalization journey of the bank. Within the Deployment Specialist/System Integrator role, you will be engaged in the entire Software Deployment and Integration Lifecycle. This includes analyzing infrastructure requirements, deploying and testing software, as well as maintaining, improving, and monitoring systems. You will focus on configuration and change management, health monitoring, performance analysis, and reporting. Additionally, you will manage DR replication for rapid system recovery during outage events. Your expertise in x86 and Google Platform technologies, Apache, Tomcat, Oracle database integration, Midrange-Infrastructure, and agile development frameworks will be essential. With at least 9 years of experience in software development, you will deploy applications in midrange and cloud environments, work with batch processing and job scheduling systems, and troubleshoot application performance effectively. Your proactive approach, strong communication skills, and ability to work collaboratively will be valuable assets in achieving sprint objectives and supporting the team. As part of the benefits package, you can enjoy a best-in-class leave policy, gender-neutral parental leaves, childcare assistance benefits, sponsorship for relevant certifications, Employee Assistance Program, comprehensive insurance coverage, and health screenings. Training, coaching, and a culture of continuous learning are provided to support your career growth and progression within the team. 
Deutsche Bank values empowerment, responsibility, commercial thinking, initiative, and collaboration. The company promotes a positive, fair, and inclusive work environment where successes are shared and celebrated. Applications are welcome from all individuals who strive to excel together in a diverse and supportive workplace environment.

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Bhopal, Madhya Pradesh

On-site

You will be working at AlignTogether Solutions, a company founded in 2016, that has established itself as a leader in offering a broad range of services including strategy, consulting, interactive technology, and operations, all seamlessly integrated with digital capabilities. The company is also excited to present its innovative product, Aligntogether.live, which transforms the world of live interactive services through an inclusive business management software solution. As a Java Spring Boot Developer with expertise in Azure, Kubernetes, Docker, and Microservices Architecture, you will be responsible for the design, development, and maintenance of robust and scalable web applications. Your role will involve collaborating with the team to create software solutions that align with project requirements, utilizing best practices in Microservices Architecture and cloud-native technologies. Key responsibilities will include software development, where you will create high-quality Java Spring Boot applications using Azure, Kubernetes, Docker, and Microservices Architecture to deliver scalable and cloud-native solutions. You will also engage in architecture and design activities, participate in code reviews to ensure quality and adherence to standards, conduct testing to identify and resolve issues, and maintain technical documentation for developed applications and components. Additionally, you will work with DevOps teams to integrate applications into CI/CD pipelines, collaborate with cross-functional teams to deliver high-quality software, troubleshoot technical issues related to Microservices, Azure, Kubernetes, and Docker, and possess familiarity with industry-standard protocols related to API security such as OAuth. 
Ideal qualifications for this role include a Bachelor's degree in Computer Science or a related field; proven experience in Java Spring Boot development, containerization, orchestration, Azure, and Microservices Architecture; and familiarity with databases, front-end technologies, version control systems, and collaborative development tools. Strong problem-solving, debugging, communication, and teamwork skills are essential for success in this position. Additionally, working experience with AngularJS or ReactJS, Redis and Memcached cache management, JBoss 7 and above, API Gateway, Apache, EJB, and Azure would be advantageous.

Posted 1 week ago


7.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Work Location: Hyderabad

What Gramener offers you
Gramener will offer you an inviting workplace, talented colleagues from diverse backgrounds, a clear career path, and steady growth prospects with great scope to innovate. Our goal is to create an ecosystem of easily configurable data applications focused on storytelling for public and private use.

Cloud Lead – Analytics & Data Products
We're looking for a Cloud Architect/Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Roles and Responsibilities
- Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
- Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
- Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
- Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
- Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
- Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
- Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Skills and Qualifications
- 7-10 years of experience in cloud engineering, DevOps, or cloud architecture roles.
- Hands-on expertise with the AWS ecosystem and tools listed above.
- Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
- Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
- Familiarity with data engineering and GenAI workflows is a plus.
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.

About Us
We help consult and deliver solutions to organizations where data is at the core of decision making. We undertake strategic data consulting for organizations in laying out the roadmap for data-driven decision making, in order to equip organizations to convert data into a strategic differentiator. Through a host of our product and service offerings we analyse and visualize large amounts of data. To know more about us, visit the Gramener Website and Gramener Blog.

Apply for this role

Posted 1 week ago


0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Overview
Would you like to help enrich the lives of learners around the world? RM India (RM Education Solutions India Private Limited) is the India Delivery Center for UK-based RM Plc. A leading supplier of technology and resources to the education sector, RM India helps deliver great education products and services that help teachers to teach and learners to learn. Our mission is to achieve growth by improving the life chances of people. At RM India, we are driven by the potential of our business to touch lives and shape the future.

RM Plc have been pioneers of education technology since 1973. We provide technology and resources to the education sector supporting over 10 million students around the world. We work with 28,000 schools, nurseries, and education trusts in 115 countries to deliver customer-centric solutions that improve education outcomes worldwide. What we do helps learners at all stages of their lives, from preschool to higher education and professional qualification; we partner with schools, examination boards, central governments and other professional institutions to enrich the lives of learners. RM Group operates through three businesses: Technology (Managed Services, Software and Infrastructure for Schools), Assessment (Software and Services) and TTS (Education Resources). Visit us here to find out more: www.rmindia.co.in

Responsibilities
Application Support (Must have)
- Extensive knowledge of troubleshooting web applications hosted in IIS or Apache.
- Should be able to replicate issues raised by customers with the available information.
- Deep dive into the issue to find the root cause within the given SLAs.
- Troubleshoot both functional and performance issues in the applications.
- Proactively analyze the event logs and prevent potential issues from happening.

Database - MS SQL / PostgreSQL (Must have)
- Expert knowledge in writing complex SQL queries in MS SQL Server or PostgreSQL.
- Should be able to troubleshoot complex stored procedures, functions, etc.
- Troubleshoot performance issues in the DB server.
- Create custom SQL queries to work around issues, bulk-update data, purge data, etc.

Monitoring - Azure Monitor, CloudWatch, Grafana, Opsgenie (Must have)
- Acknowledge alerts triggered from various monitoring solutions and resolve them. Knowledge in creating or optimizing alerts is good to have.
- Analyze logs from Azure Application Insights or tools like Sumo Logic.

Ticketing tools - ServiceNow / Jira (Must have)
- Experience in ticket management: create, update and triage tickets; maintain ticket SLAs.

Cloud - Azure / AWS (Desired)
- Hands-on experience in maintaining/troubleshooting Azure/AWS services.
- Windows/Linux VM basic-level administration such as upscale/downscale, start/stop, SSH, troubleshooting logs, checking disk space, etc.
- Basic administration of Azure SQL or Postgres RDS clusters, performance monitoring, and troubleshooting. Maintaining secrets. Storage account/S3 management activities. Basics of IAM administration.
- Troubleshoot issues of applications hosted in AKS/ECS clusters. Service bus queue troubleshooting.

Deployment - Azure DevOps / GitLab (Good to have)
- Deploying applications using existing deployment pipelines. Troubleshoot deployment failures.

Scripting - PowerShell / Shell (Good to have)
- Knowledge in writing scripts to automate tasks and set up workarounds.

Experience
Experience: 2+ years
Mandatory skillset: Application Support, Azure cloud, SQL/PostgreSQL, infra maintenance, Azure/AWS, L3 support

What's in it for you?
At RM, My Work Blend @RM provides office-based colleagues with multi-location and hybrid working options to suit them. As well as your office base, you can spend a proportion of your time working at alternative locations, and with flexibility of hours, as appropriate to the role. We encourage you to discuss arrangements for this role with your potential line manager during the recruitment process. We expect that how we make best use of hybrid working may continue to adapt as we adjust to our new ways of working.

As well as a competitive salary, our core benefits package includes Group Health Insurance, Group Personal Accident, Group Term Life Insurance, annual doctor consultation reimbursement, medical reimbursement, a monthly/quarterly/annual Rewards & Recognition program, annual salary review and bonus payouts, Children's Education Assistance Subsidy, Summer Vacation Scheme, Staff Children Engagement Programme, Knowledge Acquisition Subsidy, Transportation Subsidy, birthday and marriage gifts, and subsidized Technopark Club Membership. RM India also has a comprehensive Rewards & Recognition program to recognize and reward employees. You could even earn yourself an extra bonus for successfully recommending a friend or family member for a position within RM.

To better reflect the society that we serve, we're committed to building a diverse workforce and creating an inclusive and welcoming environment for all. To achieve this, we create teams of talented people from different backgrounds and experiences and strive to be a business where our people can bring their whole selves to work. We also want to make the recruitment process as inclusive as possible for everyone. Should you require additional support with your application or through the interview process, please contact us at talent@in.rm.com
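The proactive log analysis described under Application Support is often a small script that tallies error events and flags anything above a threshold, ahead of a monitoring alert firing. A Python sketch; the log format and the threshold of two are invented for illustration:

```python
from collections import Counter

LOG = """\
2025-07-26 10:00:01 INFO  request ok
2025-07-26 10:00:02 ERROR 500 upstream timeout
2025-07-26 10:00:03 ERROR 500 upstream timeout
2025-07-26 10:00:04 ERROR 404 not found
"""

def flag_errors(log_text, threshold=2):
    """Return status codes whose ERROR count reaches the threshold."""
    codes = Counter(
        line.split()[3]                     # the status-code token
        for line in log_text.splitlines()
        if " ERROR " in line
    )
    return [code for code, n in codes.items() if n >= threshold]

alerts = flag_errors(LOG)
```

The same tally could be expressed as a PowerShell or shell one-liner, which is where the listing's scripting skills come in; the point is spotting the repeated 500s before they become a customer ticket.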

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 individuals across 30+ countries, we are fueled by curiosity, agility, and the ambition to create enduring value for our clients. Our purpose, driven by the relentless pursuit of a world that works better for people, enables us to serve and transform leading enterprises, including the Fortune Global 500. We leverage our deep business and industry knowledge along with expertise in digital operations services, data, technology, and AI. We are currently seeking applications for the position of Principal Consultant, Frontend Developer. As a member of our team, your responsibilities will include designing, customizing, and implementing solutions for wholesale brokerage platforms within our organization. You will utilize your technical expertise to address identified requirements and specifications, resolving production issues and enhancing business functionality promptly. Additionally, you will be expected to review and analyze requirements effectively and possess proven experience as a Frontend Developer or in a similar role. Familiarity with common stacks is essential for success in this role. Qualifications we are looking for in a candidate include a Bachelor's degree in information technology or computer science, alongside professional development experience. Preferred qualifications/skills involve proven experience with MuleSoft and REST API, proficiency in Apex, Visual Force, JavaScript, and CSS, as well as mandatory knowledge of TypeScript/JavaScript frameworks such as Angular, React, and Node.js. 
Also valued: knowledge of multiple front-end languages and libraries; cloud platforms like GCP and Terraform; databases (e.g., PostgreSQL, MySQL); web servers (e.g., Apache); UI/UX design; application frameworks like Node Express and Visual Studio Code; excellent communication skills; a good understanding of AWS data analytic tools; and hands-on experience with deployment and version control tools like Copado, Git, etc.

This position is based in India, specifically Noida, and is a full-time role. The ideal candidate will hold a Bachelor's degree or equivalent qualification. The job was posted on Apr 29, 2025, at 8:36:05 AM, and the unposting date is set for Oct 26, 2025, at 4:36:05 AM. The primary skill set required includes consulting, and the job category is listed as Full Time.

Posted 1 week ago


3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

TransUnion's Job Applicant Privacy Notice

What We'll Bring
TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences and personal empowerment for hundreds of millions of people.

What You'll Bring
As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning and, as an associate here, your growth potential is limitless.

The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion's senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, UK, South Africa, Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and be able to lead and present projects with minimal supervision.

How You'll Contribute
- You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets
- You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms)
- You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques
- You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients
- You will perform multiple tasks simultaneously and deal with changing requirements and deadlines
- You will develop strong consulting skills to be able to help external customers by understanding their business needs and aligning them with TransUnion's product offerings and capabilities
- You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit
- Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects

Impact You'll Make
What you'll bring:
- A Bachelor's or Master's degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline
- Minimum 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy
- Advanced proficiency with one or more statistical programming languages such as R
- Advanced proficiency writing SQL queries for data extraction
- Experience with big data platforms (e.g. Apache Hadoop, Apache Spark) preferred
- Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint
- Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously
- Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences
- A highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior
- A strong interest in the areas of banking, consumer lending, and finance is paramount, with a curiosity as to why consumers act the way they do with their credit
- Strong work ethic with a passion for team success

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Consultant, Research & Consulting
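The segmentation and statistical analyses mentioned under "How You'll Contribute" can be as simple as bucketing consumers by score band and summarising each segment; in practice this runs in R or SQL over far larger datasets. An illustrative Python sketch with an invented score band and records:

```python
import statistics
from collections import defaultdict

# Bucket consumer records into score bands, then summarise balances
# per segment. The 740 cutoff and the records are invented.

records = [
    {"score": 810, "balance": 12000},
    {"score": 745, "balance": 30000},
    {"score": 690, "balance": 45000},
    {"score": 802, "balance": 9000},
]

def band(score):
    return "prime" if score >= 740 else "near_prime"

segments = defaultdict(list)
for r in records:
    segments[band(r["score"])].append(r["balance"])

summary = {seg: statistics.mean(vals) for seg, vals in segments.items()}
```

The equivalent SQL is a `CASE` expression feeding a `GROUP BY`, which is the "advanced proficiency writing SQL queries" half of the same skill.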

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

TransUnion's Job Applicant Privacy Notice

What We'll Bring
TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences, and personal empowerment for hundreds of millions of people.

What You'll Bring
As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning, and as an associate here, your growth potential is limitless. The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion’s senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, the UK, South Africa, the Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and be able to lead and present projects with minimal supervision.
How You’ll Contribute
- You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets
- You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms)
- You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques
- You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients
- You will perform multiple tasks simultaneously and deal with changing requirements and deadlines
- You will develop strong consulting skills so you can help external customers by understanding their business needs and aligning them with TransUnion’s product offerings and capabilities
- You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit
- Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects

Impact You'll Make
What you'll bring:
- A Bachelor’s or Master’s degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline
- A minimum of 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy
- Advanced proficiency with one or more statistical programming languages such as R
- Advanced proficiency writing SQL queries for data extraction
- Experience with big data platforms (e.g., Apache Hadoop, Apache Spark) preferred
- Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint
- Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously
- Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences
- A highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior
- A strong interest in the areas of banking, consumer lending, and finance is paramount, with a curiosity as to why consumers act the way they do with their credit
- A strong work ethic with a passion for team success

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Consultant, Research & Consulting

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description TBD

Responsibilities
- Develop and implement software testing strategies, plans, and procedures
- Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
- Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
- Collaborate across development and product teams to align on quality goals, timelines, and delivery expectations
- Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
- Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
- Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
- Develop and maintain automated test suites to ensure consistent and reliable software quality

Qualifications
- Bachelor’s degree in computer science or a related field, or equivalent experience
- 4+ years of proven experience in the software development industry, working in collaborative team environments
- 4+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
- 3+ years of hands-on experience testing and automating web services, including RESTful APIs
- 2+ years of experience in performance testing using tools such as Apache JMeter
- Strong written and verbal communication skills

Good to Have
- Experience in CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
- Working knowledge of API testing tools such as RestAssured
- Knowledge of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
- Experience or familiarity with AWS cloud services
- Familiarity with Atlassian tools, including Jira and Confluence
- Working knowledge of Agile methodologies, particularly Scrum
- Experience operating in a Continuous Integration (CI) environment

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.
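The role above asks for automated test suites for web services and RESTful APIs. The posting names Selenium and RestAssured; as a hedged, self-contained sketch of the shape such a suite takes, here is a stdlib-only example that spins up a throwaway HTTP endpoint and asserts on its JSON response (the endpoint, route, and payload are invented; a real suite would target the service under test).

```python
import json
import threading
import unittest
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StatusHandler(BaseHTTPRequestHandler):
    """Stand-in REST endpoint so the example is self-contained."""

    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the test run output quiet.
        pass

class TestStatusEndpoint(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Bind port 0 so the OS picks a free port.
        cls.server = HTTPServer(("127.0.0.1", 0), StatusHandler)
        cls.port = cls.server.server_address[1]
        threading.Thread(target=cls.server.serve_forever, daemon=True).start()

    @classmethod
    def tearDownClass(cls):
        cls.server.shutdown()

    def test_status_returns_ok(self):
        with urlopen(f"http://127.0.0.1:{self.port}/status") as resp:
            self.assertEqual(resp.status, 200)
            self.assertEqual(json.loads(resp.read()), {"status": "ok"})
```

Saved as a module, this runs under `python -m unittest`; in a CI pipeline of the kind the posting describes, the same assertions would execute against a deployed test environment on every build.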
Verisk Businesses
- Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
- Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
- Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
- Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
- Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
- Life Insurance Solutions — offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group
- Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, businesses, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age, or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law.
https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description TBD

Responsibilities
- Provide leadership, mentorship, and guidance to business analysts and QA team members on manual and automated testing
- Collaborate with product owners and business analysts to ensure user stories are well-defined, testable, and include measurable acceptance criteria
- Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
- Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
- Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
- Develop and maintain automated test suites and ensure effective integration into the CI/CD pipeline
- Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
- Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
- Collaborate across QA, development, and product teams to align on quality goals, timelines, and delivery expectations
- Support User Acceptance Testing (UAT) and incorporate customer feedback to ensure a high-quality release
- Ensure the final product meets user expectations for functionality, performance, and usability

Qualifications
- Bachelor’s degree in computer science or a related field, or equivalent practical experience
- 6+ years of proven experience in the software development industry, working in collaborative team environments
- 6+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
- 5+ years of hands-on experience testing and automating web services, including RESTful APIs
- 3+ years of experience in performance testing using tools such as Apache JMeter
- 2+ years of experience in mobile web application testing automation using Appium
- Strong experience with object-oriented programming languages such as Java and C#/.NET

Good to Have
- Experience working with CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
- Working knowledge of API testing tools such as JavaScript, RestAssured
- Solid understanding of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
- Experience or familiarity with AWS cloud services
- Strong written and verbal communication skills
- Proven ability to learn new technologies and adapt in a dynamic environment
- Familiarity with Atlassian tools, including Jira and Confluence
- Working knowledge of Agile methodologies, particularly Scrum
- Experience operating in a Continuous Integration (CI) environment

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas.
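This senior QA role adds performance testing with Apache JMeter. JMeter itself is driven by test plans rather than code, so as a hedged sketch of what a load-test run actually computes, here is the underlying measurement and percentile summary in plain Python; `fake_request` is an invented stand-in for the system under test.

```python
import random
import statistics
import time

def measure_latency(request_fn, n_requests=200):
    """Time n_requests calls and return latencies in milliseconds."""
    latencies = []
    for _ in range(n_requests):
        start = time.perf_counter()
        request_fn()
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

def summarize(latencies):
    """Report the metrics a load-test run is usually judged on."""
    ordered = sorted(latencies)
    p95_index = int(0.95 * (len(ordered) - 1))  # nearest-rank style p95
    return {
        "mean_ms": statistics.fmean(ordered),
        "p95_ms": ordered[p95_index],
        "max_ms": ordered[-1],
    }

# Invented stand-in for the endpoint under test: sleeps 0-2 ms.
def fake_request():
    time.sleep(random.uniform(0.0, 0.002))

report = summarize(measure_latency(fake_request, n_requests=50))
print({k: round(v, 2) for k, v in report.items()})
```

A JMeter test plan does the same work at scale (concurrent threads, ramp-up, assertions on percentiles); knowing the computation helps when reading its aggregate reports.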
Join us and create an exceptional experience for yourself and a better tomorrow for future generations.

Verisk Businesses
- Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
- Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
- Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
- Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
- Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
- Life Insurance Solutions — offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group
- Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, businesses, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age, or disability.
Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Senior AI/ML Scientist (IC) – Global Data Analytics, Technology (Maersk). This position will be based in India – Bangalore/Pune.

A.P. Moller – Maersk
A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers’ supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we have an elevated level of responsibility to continue to build an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners too.

The Brief
In this role as an AI/ML Scientist on the Global Data and Analytics (GDA) team, you will support the development of strategic, visibility-driven recommendation systems that serve both internal stakeholders and external customers. This initiative aims to deliver actionable insights that enhance supply chain execution, support strategic decision-making, and enable innovative service offerings. You should be able to design, develop, and implement machine learning models, conduct deep data analysis, and support decision-making with data-driven insights. Responsibilities include building and validating predictive models, supporting experiment design, and integrating advanced techniques like transformers, GANs, and reinforcement learning into scalable production systems. The role requires solving complex problems using NLP, deep learning, optimization, and computer vision. You should be comfortable working independently, writing reliable code with automated tests, and contributing to debugging and refinement. You’ll also document your methods and results clearly and collaborate with cross-functional teams to deliver high-impact AI/ML solutions that align with business objectives and user needs.

What I'll be doing – your accountabilities
- Independently design and develop scalable AI/ML solutions using advanced design patterns, models, and algorithms to analyse complex datasets and generate actionable insights
- Collaborate with stakeholders to translate business needs into AI/ML solutions, evaluate user journeys, and challenge business requirements to ensure seamless, value-driven delivery and integration of solutions
- Apply innovative problem-solving techniques, leveraging advanced methodologies to find unique approaches to complex problems and improve outcomes
- Investigate and resolve complex challenges in data pipelines, modelling, and deployment to ensure reliable solutions that meet performance benchmarks
- Mentor team members through code reviews, model validations, pairing sessions, and knowledge-sharing sessions, and contribute to Communities of Practice
- Effectively communicate modelling, infrastructure, and deployment choices to both technical and non-technical stakeholders
- Maintain detailed, impactful documentation, covering methodologies, data pipelines, model performance, and key design decisions to enable reproducibility and scalability
- Ensure readiness for production releases, focusing on testing, monitoring, observability, and maintaining scalability and reusability of models for future projects
- Drive cross-team and cross-discipline initiatives to optimize workflows, remove redundant applications and processes, share best practices, and enhance collaboration between teams
- Demonstrate awareness of shared platform capabilities and actively identify opportunities to leverage them in designing efficient and scalable AI/ML solutions

Foundational Skills
- Mastered Data Analysis, Data Science, and AI & Machine Learning concepts, and can demonstrate this skill in complex scenarios
- Programming and Statistical Analysis skills beyond the fundamentals, demonstrable in most situations without guidance
Specialized Skills
Understands beyond the fundamentals and can demonstrate in most situations without guidance:
- Data Validation and Testing
- Model Deployment
- Machine Learning Pipelines
- Deep Learning
- Natural Language Processing (NLP)
- Optimization & Scientific Computing
- Decision Modelling and Risk Analysis
- Technical Documentation

Qualifications & Requirements
- Bachelor's degree in B.E/BTech, preferably in computer science
- Experience with a collaborative development workflow: IDE (Integrated Development Environment), version control (GitHub), CI/CD (e.g., automated tests in GitHub Actions)
- Communicates effectively with technical and non-technical audiences, with experience in stakeholder management
- Structured, highly analytical mindset and excellent problem-solving skills
- Self-starter, highly motivated, and willing to share knowledge and work as a team
- An individual who respects the opinion of others, yet can drive a decision through the team

Preferred Experiences
- 8+ years of relevant experience in the field of Data Engineering
- 5+ years of hands-on experience with Apache Spark, Python, and SQL
- Experience working with large datasets and big data technologies to train and evaluate machine learning models
- Experience with containerization: Kubernetes & Docker
- Expertise in building cloud-native applications and data pipelines (Azure, Databricks, AWS, GCP)
- Experience with common dashboarding and API technologies (PowerBI, Superset, Flask, FastAPI, etc.)

As a performance-oriented company, we strive to always recruit the best person for the job – regardless of gender, age, nationality, sexual orientation or religious beliefs. We are proud of our diversity and see it as a genuine source of strength for building high-performing teams. Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
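The brief above names "building and validating predictive models" with automated tests as a core accountability. The production stack would be Spark/Databricks, but the validation loop itself is stack-independent; as a hedged stdlib-only sketch (data, split ratio, and the mean-predictor baseline are all invented for illustration):

```python
import random
import statistics

def train_test_split(rows, test_fraction=0.25, seed=42):
    """Shuffle and split rows into train/test partitions, reproducibly."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

def fit_mean_baseline(train):
    """'Train' the simplest possible model: always predict the mean target."""
    return statistics.fmean(y for _, y in train)

def mean_absolute_error(prediction, test):
    """MAE of a constant prediction against held-out targets."""
    return statistics.fmean(abs(y - prediction) for _, y in test)

# Invented toy data: (feature, target) pairs, e.g. for transit-time prediction.
data = [(i, 2.0 * i + random.Random(i).uniform(-0.5, 0.5)) for i in range(40)]

train, test = train_test_split(data)
baseline = fit_mean_baseline(train)
print("baseline MAE:", round(mean_absolute_error(baseline, test), 2))
```

The point of the baseline is the workflow: any real model (gradient-boosted trees, a transformer) has to beat this number on the held-out set before it earns a path to production.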

Posted 1 week ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
- 3+ years of experience in implementing analytical solutions using Palantir Foundry, preferably in PySpark and hyperscaler platforms (cloud services like AWS, GCP, and Azure), with a focus on building data transformation pipelines at scale
- Team management: must have experience in mentoring and managing large teams (20 to 30 people) for complex engineering programs, and in hiring and nurturing talent in Palantir Foundry
- Training: should have experience in creating training programs in Foundry and delivering them in a hands-on format, either offline or virtually
- At least 3 years of hands-on experience building and managing Ontologies on Palantir Foundry
- At least 3 years of experience with Foundry services:
  - Data engineering with Contour and Fusion
  - Dashboarding and report development using Quiver (or Reports)
  - Application development using Workshop
  - Exposure to Map and Vertex is a plus
  - Palantir AIP experience is a plus
- Hands-on experience in data engineering and building data pipelines (code/no code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry
- Hands-on experience managing the data life cycle on at least one hyperscaler platform (AWS, GCP, Azure) using managed services or containerized deployments for data pipelines
- Hands-on experience working and building on Ontology (especially demonstrable experience in building semantic relationships)
- Proficiency in SQL, Python, and PySpark, with demonstrable ability to write and optimize SQL and Spark jobs; some experience with Apache Kafka and Airflow is a prerequisite as well
- Hands-on experience with DevOps on hyperscaler platforms and Palantir Foundry
- Experience in MLOps is a plus
- Experience in developing and managing scalable architecture, and working experience in managing large data sets
- Open-source contributions (or own repositories highlighting work) on GitHub or Kaggle are a plus
- Experience with graph data and graph analysis libraries (like Spark GraphX, Python NetworkX, etc.) is a plus
- A Palantir Foundry Certification (Solution Architect, Data Engineer) is a plus; the certificate should be valid at the time of interview
- Experience in developing GenAI applications is a plus

Mandatory skill sets:
- At least 3 years of hands-on experience building and managing Ontologies on Palantir Foundry
- At least 3 years of experience with Foundry services

Preferred skill sets: Palantir Foundry
Years of experience required: 4 to 7 years (3+ years relevant)
Education qualification: Bachelor's degree in computer science, data science, or any other engineering discipline; a Master's degree is a plus.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Science
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Palantir (Software)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
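The skills list above centres on PySpark data transformation pipelines. PySpark needs a Spark runtime, so as a hedged stand-in, here is the same filter-then-aggregate shape in plain Python; the records and column names are invented, and the comment notes the rough PySpark equivalent.

```python
from collections import defaultdict

# Invented raw records, the kind a Foundry/Spark pipeline might ingest.
records = [
    {"region": "EU", "status": "ok", "amount": 120.0},
    {"region": "EU", "status": "error", "amount": 0.0},
    {"region": "US", "status": "ok", "amount": 75.0},
    {"region": "US", "status": "ok", "amount": 25.0},
]

def transform(rows):
    """Drop bad rows, then total amount per region.

    Roughly mirrors a PySpark pipeline like:
        df.filter(col("status") == "ok").groupBy("region").sum("amount")
    """
    clean = (r for r in rows if r["status"] == "ok")
    totals = defaultdict(float)
    for r in clean:
        totals[r["region"]] += r["amount"]
    return dict(totals)

print(transform(records))
```

The value of expressing pipelines this declaratively, in Spark or Foundry alike, is that the engine can distribute the filter and the aggregation across a cluster without the transformation logic changing.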

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, with an analytical mindset and a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India, and Costa Rica.

Responsibilities
ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.

Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows.

Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards their implementation and execution. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and reviewed. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.

Collaboration & Process Improvement – The engineer will be asked to participate in knowledge-share sessions where they will engage with peers to discuss solutions, best practices, and overall approach. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration.

Job Requirements
The CDP ETL & Database Engineer will be well versed in the following areas:
- Relational data modeling
- ETL and FTP concepts
- Advanced analytics using SQL functions
- Cloud technologies - AWS, Snowflake
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timelines
- The ability to work independently, while also contributing in a team setting; the engineer will be able to confidently communicate status, raise exceptions, and voice concerns to their direct manager
- Participate in internal client project status meetings with Solution/Delivery management
- When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements
- Ability to work in a fast-paced, agile environment; the individual will be able to work with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow management and time allocation
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives
Required Skills

ETL – ETL tools such as Talend (preferred, not required); DMExpress – nice to have; Informatica – nice to have.
Database – Hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL – nice to have; familiarity with NoSQL DB methodologies – nice to have.
Programming Languages – Can demonstrate knowledge of any of the following: PL/SQL; JavaScript – strong plus; Python – strong plus; Scala – nice to have.
AWS – Knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store.
Understands JSON data structures and key-value pairs.
Working knowledge of code repositories such as Git, WinCVS.
Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx.
Jira.

Minimum Qualifications

Bachelor's degree or equivalent.
4+ years' experience.
Excellent verbal & written communication skills.
Self-starter, highly motivated.
Analytical mindset.

Company Summary

Zeta Global is a NYSE-listed, data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc. and Pepsi-Cola, the company combines the industry's third-largest proprietary data set (2.4B+ identities) with artificial intelligence to unlock consumer intent, personalize experiences, and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers 'end to end' marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable.
Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veteran status, or any other basis protected by law.

Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm

https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver-gen-ai-marketing-automation/
https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken-conference.html
https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24-Guidance
https://www.prnewswire.com/news-releases/zeta-global-opens-ai--data-labs-in-san-francisco-and-nyc-300S45353.html
https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and-cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring AWS Professionals in the following areas:

AWS Data Engineer

Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python
Secondary skillsets: any ETL tool, GitHub, DevOps (CI/CD)
Experience: 3-4 years
Degree in computer science, engineering, or similar fields

Mandatory Skill Set: Python, PySpark, SQL, AWS, with experience designing, developing, testing, and supporting data pipelines and applications.

3+ years' working experience in data integration and pipeline development.
3+ years of experience with AWS Cloud on data integration with a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems; Databricks and Redshift experience is a major plus.
3+ years of experience using SQL in development of data warehouse projects/applications (Oracle & SQL Server).
Strong real-life experience in Python development, especially in PySpark in an AWS Cloud environment.
Strong SQL and NoSQL databases like MySQL, Postgres, DynamoDB, Elasticsearch.
Workflow management tools like Airflow.
AWS cloud services: RDS, AWS Lambda, AWS Glue, AWS Athena, EMR (equivalent tools in the GCP stack will also suffice).

Good to have: Snowflake, Palantir Foundry

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment.
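As a rough illustration of the SQL-side pipeline work listed above, the sketch below uses SQLite as an in-memory stand-in for the Redshift/RDS targets named in the posting; the table and column names are invented for the example:

```python
import sqlite3

# In-memory SQLite standing in for a warehouse target (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# The kind of aggregation a pipeline might push down to the warehouse layer.
total = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(total)  # -> [(1, 15.0), (2, 7.5)]
```

In a real AWS pipeline the same SQL pattern would run against Redshift or Athena, with PySpark or Glue handling the upstream ingestion.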
We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:

Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and ethical corporate culture

Posted 1 week ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description:

Job Title: Databricks Infrastructure Engineer
Location: Hyderabad/Bengaluru

Job Summary:
We are looking for a skilled Databricks Infrastructure Engineer to design, build, and manage the cloud infrastructure that supports Databricks development efforts. This role will focus on creating and maintaining scalable, secure, and automated infrastructure environments using Terraform and other Infrastructure-as-Code (IaC) tools. The infrastructure will enable data engineers and developers to efficiently create notebooks, pipelines, and ingest data following the Medallion architecture (Bronze, Silver, Gold layers). The ideal candidate will have strong cloud engineering skills, deep knowledge of Terraform, and hands-on experience with Databricks platform provisioning.

Key Responsibilities:

Infrastructure Design & Provisioning:
Design and implement scalable and secure infrastructure environments to support Databricks workloads aligned with the Medallion architecture.
Develop and maintain Infrastructure-as-Code (IaC) scripts and templates using Terraform and/or ARM templates for provisioning Databricks workspaces, clusters, storage accounts, networking, and related Azure resources.
Automate the setup of data ingestion pipelines, storage layers (Bronze, Silver, Gold), and access controls necessary for smooth data operations.

Platform Automation & Optimization:
Create automated deployment pipelines integrated with CI/CD tools (e.g., Azure DevOps, Jenkins) to ensure repeatable and consistent infrastructure provisioning.
Optimize infrastructure configurations to balance performance, scalability, security, and cost-effectiveness.
Monitor infrastructure health and perform capacity planning to support evolving data workloads.
Implement and maintain backup, recovery, and disaster recovery strategies for Databricks environments.
Optimize performance of Databricks clusters, jobs, and SQL endpoints.
Automate routine administration tasks using scripting and orchestration tools.
Troubleshoot platform issues, identify root causes, and implement solutions.

Security & Governance:
Implement security best practices including network isolation, encryption, identity and access management (IAM), and role-based access control (RBAC) within the infrastructure.
Collaborate with governance teams to embed compliance and audit requirements into infrastructure automation.

Collaboration & Support:
Work closely with data engineers, data scientists, and platform administrators to understand infrastructure requirements and deliver solutions that enable efficient data engineering workflows.
Provide documentation and training on infrastructure setup, usage, and best practices.
Troubleshoot infrastructure issues and coordinate with cloud and platform support teams for resolution.
Stay up to date with Databricks features, releases, and best practices.

Required Qualifications:
10+ years of experience in Databricks and cloud infrastructure engineering, preferably with Azure.
Strong hands-on experience writing Infrastructure-as-Code using Terraform; experience with ARM templates or CloudFormation is a plus.
Practical knowledge of provisioning and managing Databricks environments and associated cloud resources.
Familiarity with Medallion architecture and data lakehouse concepts.
Experience with CI/CD pipeline creation and automation tools such as Azure DevOps, Jenkins, or GitHub Actions.
Solid understanding of cloud networking, storage, security, and identity management.
Proficiency in scripting languages such as Python, Bash, or PowerShell.
Strong collaboration and communication skills to work across cross-functional teams.

Preferred Skills:
Prior experience working with the Databricks platform, including workspace and cluster management.
Knowledge of data governance tools and practices.
Experience with monitoring and logging tools (e.g., Azure Monitor, CloudWatch).
Exposure to containerization and orchestration technologies such as Docker and Kubernetes.
Understanding of data ingestion frameworks and pipeline orchestration tools like Apache Airflow or Azure Data Factory.

Weekly Hours: 40
Time Type: Regular
Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City - Adm: Argus Building, Sattva, Knowledge City

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Mandatory Skills

8-10 years of experience.
Strong proficiency in Python and SQL, and experience with data processing libraries (e.g., Pandas, PySpark).
Familiarity with Generative AI frameworks like LangChain, LangGraph, or similar tools.
Experience integrating APIs from pre-trained AI models (e.g., OpenAI, Cohere, Hugging Face).
Solid understanding of data structures, algorithms, and distributed systems.
Experience with vector databases (e.g., Pinecone, Postgres).
Familiarity with prompt engineering and chaining AI workflows.
Understanding of MLOps practices for deploying and monitoring AI applications.
Strong problem-solving skills and ability to work in a collaborative environment.

Good To Have

Experience with Streamlit to build application front-ends.

Job Description

We are looking for an experienced Python Developer with expertise in Spark, SQL, data processing, and building Generative AI applications. The ideal candidate will focus on leveraging existing AI models and frameworks (e.g., LangChain, LangGraph) to create innovative, data-driven solutions. This role does not involve designing new AI models, but rather integrating and utilizing pre-trained models to solve real-world problems.

Key Responsibilities

Develop and deploy Generative AI applications using Python and frameworks like LangChain or LangGraph.
Work with large-scale data processing frameworks like Apache Spark and SQL to prepare and manage data pipelines.
Integrate pre-trained AI models (e.g., OpenAI, Hugging Face, Llama) into scalable applications.
Understand ML/NLP concepts and algorithms, with exposure to Scikit-learn and PyTorch.
Collaborate with data engineers and product teams to design AI-driven solutions.
Optimize application performance and ensure scalability in production environments.
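Chaining prompts through a pre-trained model, as described above, can be sketched in plain Python. The `fake_model` stub below stands in for a hosted LLM API (OpenAI, Hugging Face, etc.); no real LangChain/LangGraph calls are made, and the templates are invented for illustration:

```python
def fake_model(prompt: str) -> str:
    # Stand-in for a pre-trained model API call (hypothetical, offline).
    return f"summary of: {prompt}"

def chain(steps, user_input):
    """Run a list of prompt templates, feeding each model output into the next."""
    text = user_input
    for template in steps:
        text = fake_model(template.format(input=text))
    return text

out = chain(["Clean this record: {input}", "Summarize: {input}"], "raw data")
print(out)
```

A real implementation would swap `fake_model` for the provider SDK call and add retries, token limits, and monitoring, which is where the MLOps practices listed above come in.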

Posted 1 week ago

Apply