Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi, greetings from Peoplefy Infosolutions! We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 10+ years of experience who are currently working as a Data Architect.

Job Description: We are seeking a highly skilled and experienced Cloud Data Architect to design, implement, and manage scalable, secure, and efficient cloud-based data solutions. The ideal candidate will possess a strong combination of technical expertise, analytical skills, and the ability to collaborate effectively with cross-functional teams to translate business requirements into technical solutions.

Key Responsibilities: Design and implement data architectures, including data pipelines, data lakes, and data warehouses, on cloud platforms. Develop and optimize data models (e.g., star schema, snowflake schema) to support business intelligence and analytics. Leverage big data technologies (e.g., Hadoop, Spark, Kafka) to process and analyze large-scale datasets. Manage and optimize relational and NoSQL databases for performance and scalability. Develop and maintain ETL/ELT workflows using tools like Apache NiFi, Talend, or Informatica. Ensure data security and compliance with regulations such as GDPR and CCPA. Automate infrastructure deployment using CI/CD pipelines and Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Collaborate with analytics teams to integrate machine learning frameworks and visualization tools (e.g., Tableau, Power BI). Provide technical leadership and mentorship to team members.

Interested candidates for the above position, kindly share your CV at sneh.ne@peoplefy.com
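The star and snowflake schemas this posting asks for are easiest to see in miniature. A minimal sketch using SQLite; the table and column names are invented for illustration, not from any real system:

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimension tables,
# queried with the kind of join BI tools generate.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date VALUES (10, 2024), (11, 2025);
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 11, 75.0);
""")

# Revenue per product per year: the classic fact-to-dimension join.
rows = cur.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)  # [('gadget', 2025, 75.0), ('widget', 2024, 100.0), ('widget', 2025, 50.0)]
```

A snowflake schema differs only in that the dimension tables are themselves normalized into sub-dimensions.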
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a seasoned Manager - Data Engineering with 12-18 years of total experience in data engineering, including 3-5 years in a leadership/managerial role, you will lead complex data platform implementations using Databricks or the Apache data stack. Your key responsibilities will include leading high-impact data engineering engagements for global clients, delivering scalable solutions, and driving digital transformation. You must have hands-on experience in Databricks OR core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.) and expertise in one or more cloud platforms such as AWS, Azure, or GCP, ideally with Databricks on cloud. Strong programming skills in Python, Scala, and SQL are essential, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also required. Your role will involve leading the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks. Additionally, you will own delivery accountability for data engineering programs across BFSI, telecom, healthcare, or manufacturing clients. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key part of your responsibilities. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance is crucial. You will manage and mentor a team of 10-25 engineers, conducting performance reviews, capability building, and coaching. Supporting presales activities including solutioning, technical proposals, and client workshops will also be part of your role. At GlobalLogic, we prioritize a culture of caring and continuous learning and development. 
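The delta-lakehouse layering the role calls for is often described as bronze/silver/gold stages. A toy pure-Python sketch of that flow (in Databricks each stage would be a Delta table and the functions Spark jobs); all names and sample rows are illustrative:

```python
# Bronze: raw rows as landed, everything a string, malformed rows included.
RAW = [
    {"id": "1", "amount": "10.5", "country": "IN"},
    {"id": "2", "amount": "bad", "country": "IN"},   # malformed row
    {"id": "3", "amount": "4.5", "country": "US"},
]

def to_silver(bronze):
    """Validate and type-cast raw rows, dropping ones that fail parsing."""
    silver = []
    for row in bronze:
        try:
            silver.append({"id": int(row["id"]),
                           "amount": float(row["amount"]),
                           "country": row["country"]})
        except ValueError:
            continue  # a real pipeline would quarantine these, not drop them
    return silver

def to_gold(silver):
    """Aggregate cleaned rows into a reporting-ready summary."""
    totals = {}
    for row in silver:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(RAW))
print(gold)  # {'IN': 10.5, 'US': 4.5}
```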
You'll have the opportunity to work on interesting and meaningful projects that have a real impact. We offer balance and flexibility, ensuring that you can achieve the perfect equilibrium between work and life. As a high-trust organization, integrity is key, and you can trust that you are part of a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. As part of our team, you'll collaborate with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are looking for an experienced Manager Data Engineering with expertise in Databricks or the Apache data stack to lead complex data platform implementations. As the Manager Data Engineering, you will play a crucial role in spearheading high-impact data engineering projects for global clients, delivering scalable solutions, and catalyzing digital transformation. You should have a total of 12-18 years of experience in data engineering, with at least 3-5 years in a leadership or managerial capacity. Hands-on experience in Databricks or core Apache stack components such as Spark, Kafka, Hive, Airflow, NiFi, etc., is essential. Proficiency in one or more cloud platforms like AWS, Azure, or GCP is preferred, ideally with Databricks on the cloud. Strong programming skills in Python, Scala, and SQL are required, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is advantageous. Your responsibilities will include leading the architecture, development, and deployment of modern data platforms utilizing Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks, ensuring delivery accountability for data engineering programs across various industries. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key aspect of your role. Additionally, you will be responsible for ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance. Managing and mentoring a team of 10-25 engineers, conducting performance reviews, capability building, and coaching will also be part of your responsibilities. At GlobalLogic, we prioritize a culture of caring where people come first. 
You will have opportunities for continuous learning and development, engaging in interesting and meaningful work that makes an impact. We believe in providing balance and flexibility to help you integrate your work and life effectively. GlobalLogic is a high-trust organization built on integrity and ethical values, providing a safe and reliable environment for your professional growth and success. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with leading companies worldwide to create innovative digital products and experiences. Join us to be a part of transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary: Senior Data Scientist, Product Data & Analytics. Our Vision: The Product Data & Analytics team builds internal analytic partnerships, strengthening focus on the health of the business, portfolio and revenue optimization opportunities, initiative tracking, new product development and go-to-market strategies. We are a hands-on global team providing scalable end-to-end data solutions by working closely with the business. We influence decisions across Mastercard through data-driven insights. We are a team of analytics engineers, data architects, BI developers, data analysts and data scientists, and fully manage our own data assets and solutions. Are you excited about Data Assets and the value they bring to an organization? Are you an evangelist for data-driven decision making? Are you motivated to be part of a Global Analytics team that builds large-scale Analytical Capabilities supporting end users across the continents? Are you interested in proactively looking to improve data-driven decisions for a global corporation? Role: Responsible for developing data-driven, innovative, scalable analytical solutions and identifying opportunities to support business and client needs in a quantitative manner and facilitate informed recommendations/decisions. Accountable for delivering high-quality project solutions and tools within agreed-upon timelines and budget parameters and conducting post-implementation reviews.
Contributes to the development of custom analyses and solutions, deriving insights from extracted data to solve critical business questions. Activities include developing and creating predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations. Able to develop AI/ML capabilities, as needed, on large volumes of data to support analytics and reporting needs across products, markets and services. Able to build end-to-end reusable, multi-purpose AI models to drive automated insights and recommendations. Leverage open and closed source technologies to solve business problems. Work closely with global and regional teams to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data to support analytics and reporting needs across products, markets, and services. Support initiatives in developing predictive models, behavioural segmentation frameworks, profitability analyses, ad hoc reporting, and data visualizations. Translate client/stakeholder needs into technical analyses and/or custom solutions in collaboration with internal and external partners, derive insights, and present findings and outcomes to clients/stakeholders to solve critical business questions. Create repeatable processes to support development of modelling and reporting. Delegate and review work for junior-level colleagues to ensure downstream applications and tools are not compromised or delayed. Serve as a mentor for junior-level colleagues, and develop talent via ongoing technical training, peer review, etc. All About You: 6-8 years of experience in data management, data mining, data analytics, data reporting, data product development and quantitative analysis. Advanced SQL skills, with the ability to write optimized queries for large data sets.
Experience with platforms/environments: Cloudera Hadoop, the big data technology stack, SQL Server, Microsoft BI Stack, Cloud, Snowflake, and other relevant technologies. Data visualization tools (Tableau, Domo, and/or Power BI or similar) experience is a plus. Experience with data validation, quality control and cleansing processes for new and existing data sources. Experience with classical and deep machine learning algorithms such as Logistic Regression, Decision Trees, Clustering (K-means, Hierarchical and Self-Organizing Maps), t-SNE, PCA, Bayesian models, Time Series ARIMA/ARMA, Random Forest, GBM, KNN, SVM, text mining techniques, Multilayer Perceptron, and neural networks (feedforward, CNN, NLP, etc.). Experience with deep learning techniques, open-source tools and technologies, statistical tools, and programming environments such as Python, R, and big data platforms such as Hadoop, Hive, Spark, and GPU clusters for deep learning. Experience in automating and creating data pipelines via tools such as Alteryx or SSIS; NiFi is a plus. Financial institution or payments experience is a plus. Additional Competencies: Excellent English, quantitative, technical, and communication (oral/written) skills. Ownership of end-to-end project delivery/risk mitigation. Virtual team management and managing stakeholders by influence. Analytical/problem-solving skills. Able to prioritize and perform multiple tasks simultaneously. Able to work across varying time zones. Strong attention to detail and quality. Creativity/innovation. Self-motivated; operates with a sense of urgency. In-depth technical knowledge, drive, and ability to learn new technologies.
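Among the clustering algorithms listed, K-means is compact enough to sketch from scratch. A minimal pure-Python version for 2-D points, not tied to any library the team actually uses; the data and initialization strategy are invented for illustration:

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centroid, then
    recompute each centroid as the mean of its cluster."""
    # Evenly spaced initial centroids: deterministic, fine for a sketch.
    step = max(1, len(points) // k)
    centroids = list(points[::step])[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Mean of each cluster; keep the old centroid if a cluster is empty.
        centroids = [
            tuple(sum(axis) / len(cl) for axis in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids

# Two well-separated blobs: centroids converge to the blob means.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(sorted(kmeans(pts, 2)))  # centroids near (0.33, 0.33) and (10.33, 10.33)
```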
Must be able to interact with management and internal stakeholders. Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. #AI R-244065
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Permanent • Apply by 23-Jul-2025 About the role: Become a quick learner and be more proactive in understanding the wider business requirements and linking them to the various other concepts in the domain. Help implement better solutions independently and faster with better ownership. Help automate manual operational tasks and focus on creating reusable assets to propel innovation. Work very closely with team members and maintain healthy relationships. Be innovative and able to come up with ideas and reusable components and frameworks. Should be ready to support 24x7 as per the rota. Should be based in Bangalore (or in the process of moving) and ready to come to the Tesco office when asked. What is in it for you: At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles - simple, fair, competitive, and sustainable. Salary - Your fixed pay is the guaranteed pay as per your contract of employment. Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company’s policy. Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF. Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents. Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request. Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan. Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle. You will be responsible for: Should have worked on building large-scale distributed systems. Should have led a team in a tech lead/module lead role. Should have good mentorship experience. Should have good communication and very good documentation skills. Should show maturity, understand requirements, and convert them into high-quality technical requirements. Should code and design end-to-end data flow and deliver on time. Be resilient and flexible to work across multiple teams and internal teams. Should help implement best practices in Data Architecture and Enterprise Software Development. Should have extensive experience working in Agile data engineering teams. Work very closely with the Engineering Manager, TPM, Product Manager and stakeholders.
You will need: Basic concepts of Data Engineering, ingestion from diverse sources and file formats, Hadoop, Data Warehousing, designing and implementing large-scale distributed data platforms and data lakes. Building distributed platforms or services. SQL, Spark, query tuning and performance optimization. Advanced Scala or Java experience (e.g., functional programming, using case classes, complex data structures and algorithms). Experience with SOLID and DRY principles and good software architecture and design experience. Languages: Python, Java, Scala. Good experience in big data unit, system, integration and regression testing. DevOps experience with Jenkins, Maven, GitHub, Artifactory/JFrog, CI/CD. Big Data Processing: Hadoop, Sqoop, Spark and Spark Streaming. Hadoop Distributions: Cloudera/Hortonworks experience. Data Streaming: experience with Kafka and Spark Streaming. Data validation and data quality. Data Lake and Medallion Architecture. Shell scripting and automation using Ansible or related configuration management tools. Agile processes and tools like Jira and Confluence. Code management tools like Git. File formats like ORC, Avro, Parquet, JSON and CSV. Big Data Orchestration: NiFi, Airflow, Spark on Kubernetes, YARN, Oozie, Azkaban. About us: Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers.
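The Kafka and Spark Streaming experience asked for above usually begins with windowed aggregation. A toy tumbling-window count in pure Python; the event shape and window size are invented for illustration, and a real job would read from a Kafka topic rather than a list:

```python
from collections import defaultdict

def tumbling_counts(events, window_s=60):
    """Bucket (timestamp, key) events into fixed-size windows and count
    per key: the basic aggregation behind a streaming job."""
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to its window start.
        window_start = (ts // window_s) * window_s
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (75, "view")]
print(tumbling_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```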
Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Technology: Today, our Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India. In India, our Technology division includes teams dedicated to Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and other roles. At Tesco, our retail platform comprises a wide array of capabilities, value propositions, and products, essential for crafting exceptional retail experiences for our customers and colleagues across all channels and markets. This platform encompasses all aspects of our operations, from identifying and authenticating customers, managing products, pricing, promoting, and enabling customers to discover products, to facilitating payment and ensuring delivery. By developing a comprehensive Retail Platform, we ensure that as customer touchpoints and devices evolve, we can consistently deliver seamless experiences. This adaptability allows us to respond flexibly without the need to overhaul our technology, thanks to the capabilities we have built.
Posted 1 week ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Senior DB Developer – Sports/Healthcare
Location: Ahmedabad, Gujarat
Job Type: Full-Time

Job Description: We are seeking an exceptional Senior Database Developer with 8+ years of expertise who will play a critical role in the design and development of a scalable, configurable, and customizable platform. Our new Senior Database Developer will help with the design, collaborate with cross-functional teams, and provide data solutions for delivering high-performance applications. If you are passionate about bringing innovative technology to life, owning and solving problems in an independent, fail-fast and highly supportive environment, and working with a creative and dynamic team, we want to hear from you. This role requires a strong understanding of enterprise applications and large-scale data processing platforms.

Key Responsibilities:
● Design and architect scalable, efficient, highly available and secure database solutions to meet business requirements.
● Design the schema and ER diagram for a horizontally scalable architecture.
● Strong knowledge of NoSQL/MongoDB.
● Knowledge of ETL tools for data migration from source to destination.
● Establish database standards, procedures, and best practices for data modelling, storage, security, and performance.
● Implement data partitioning, sharding, and replication for high-throughput systems.
● Optimize data lake, data warehouse, and NoSQL solutions for fast retrieval.
● Collaborate with developers and data engineers to define data requirements and optimize database performance.
● Implement database security policies ensuring compliance with regulatory standards (e.g., GDPR, HIPAA).
● Optimize and tune databases for performance, scalability, and availability.
● Design disaster recovery and backup solutions to ensure data protection and business continuity.
● Evaluate and implement new database technologies and frameworks as needed.
● Provide expertise in database migration, transformation, and modernization projects.
● Conduct performance analysis and troubleshooting of database-related issues.
● Document database architecture and standards for future reference.

Required Skills and Qualifications:
● 8+ years of experience in database architecture, design, and management.
● Experience with AWS (Amazon Web Services) and similar platforms like Azure and GCP (Google Cloud Platform).
● Experience deploying and managing applications, utilizing various cloud services (compute, storage, databases, etc.).
● Experience with specific services like EC2, S3, and Lambda (for AWS).
● Proficiency with SQL and NoSQL databases (e.g., PostgreSQL, MySQL, Oracle, MongoDB, Cassandra).
● MongoDB and NoSQL experience is a big added advantage.
● Expertise in data modelling, schema design, indexing, and partitioning.
● Experience with ETL processes, data warehousing, and big data technologies (e.g., Apache NiFi, Airflow, Redshift, Snowflake, Hadoop).
● Proficiency in database performance tuning, optimization, and monitoring tools.
● Strong knowledge of data security, encryption, and compliance frameworks.
● Excellent analytical, problem-solving, and communication skills.
● Proven experience in database migration and modernization projects.

Preferred Qualifications:
● Certifications in cloud platforms (AWS, GCP, Azure) or database technologies.
● Experience with machine learning and AI-driven data solutions.
● Knowledge of graph databases and time-series databases.
● Familiarity with Kubernetes, containerized databases, and microservices architecture.

Education:
● Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related technical field.

Why Join Us?
● Be part of an exciting and dynamic project in the sports/health data domain.
● Work with cutting-edge technologies and large-scale data processing systems.
● Collaborative, fast-paced team environment with opportunities for professional growth.
● Competitive salary, bonus, and benefits package.
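The partitioning and sharding responsibilities above can be illustrated with the simplest routing scheme: a stable hash over the record key. A sketch in which the helper name and shard count are hypothetical and not tied to any particular database's partitioner:

```python
import hashlib

def shard_for(key, n_shards=4):
    """Route a record key to a shard via a stable hash. MD5 is used here
    only because it is deterministic across processes, unlike hash()."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % n_shards

# The same key always routes to the same shard, so reads need no lookup table.
assert shard_for("user:42") == shard_for("user:42")
print({k: shard_for(k) for k in ["user:1", "user:2", "user:3"]})
```

Production systems typically layer replication and rebalancing (e.g., consistent hashing) on top of this basic idea so that adding a shard does not remap every key.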
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Software Developer specializing in Java with React/Angular, SQL, and APIs, your main role will involve the design, development, and maintenance of software applications using Java and its related technologies. Proficiency in React or Angular will be advantageous for creating modern, dynamic user interfaces for web applications. Alongside your Java skills, it is essential to possess a strong understanding of HTML, CSS, and JavaScript, as well as experience with frameworks like Angular or React. For this position, it is crucial to have knowledge in areas such as software design patterns, the Unix environment, and database technologies including SQL and NoSQL, with databases like Oracle and Netezza. Experience in RESTful web services, API design, and full-stack Java development would be highly beneficial. Additional assets include knowledge of Redis and experience with NiFi and APIs. As part of Agile teams, you will contribute your expertise in Data Engineering and in implementing end-to-end DW projects in a Big Data environment. Strong analytical abilities are required for debugging production issues, offering root cause analysis, and implementing mitigation plans. Effective communication, both verbal and written, is essential, along with excellent relationship-building, collaboration, and organizational skills. In this role, you will need to multitask across various projects, interact with internal and external resources, and provide technical guidance to junior team members. A high-energy, detail-oriented, and proactive approach, combined with the ability to work under pressure independently, will be invaluable. Initiative and self-motivation are key qualities for driving results. You should also be quick to learn and apply new technologies, conducting POCs to identify optimal solutions for problem statements.
The flexibility to collaborate in diverse and geographically distributed project teams within a matrix-based environment is also essential for this role.
Posted 1 week ago
2.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Tiger Analytics is a global AI and analytics consulting firm with a team of over 2800 professionals focused on using data and technology to solve complex problems that impact millions of lives worldwide. Our culture is centered around expertise, respect, and a team-first mindset. Headquartered in Silicon Valley, we have delivery centers globally and offices in various cities across India, the US, UK, Canada, and Singapore, along with a significant remote workforce. At Tiger Analytics, we are certified as a Great Place to Work. Joining our team means being at the forefront of the AI revolution, working with innovative teams that push boundaries and create inspiring solutions. We are currently looking for an Azure Big Data Engineer to join our team in Chennai, Hyderabad, or Bangalore. As a Big Data Engineer (Azure), you will be responsible for building and implementing various analytics solutions and platforms on Microsoft Azure using a range of Open Source, Big Data, and Cloud technologies. Your typical day might involve designing and building scalable data ingestion pipelines, processing structured and unstructured data, orchestrating pipelines, collaborating with teams and stakeholders, and making critical tech-related decisions. To be successful in this role, we expect you to have 4 to 9 years of total IT experience with at least 2 years in big data engineering and Microsoft Azure. You should be proficient in technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Strong coding skills in SQL, Python, or Scala/Java are essential, as well as experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4J, and Elastic Search. Knowledge of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV is also required. 
Ideally, you should have experience in building REST APIs, working on Data Lake or Lakehouse projects, supporting BI and Data Science teams, and following Agile and DevOps processes. Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) would be a valuable addition to your profile. At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with different skills and qualities to apply, even if they do not meet all the criteria for the role. We are committed to providing equal opportunities and fostering a culture of listening, trust, respect, and growth. Please note that the job designation and compensation will be based on your expertise and experience, and our compensation packages are competitive within the industry. If you are passionate about leveraging data and technology to drive impactful solutions, we would love to stay connected with you.
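The structured-data ingestion described in this posting reduces, at its smallest, to parsing, schema enforcement, and re-serialization. A stdlib-only sketch of that step (the schema and sample rows are invented; an ADF/PySpark pipeline does the same thing at scale):

```python
import csv
import io
import json

# Hypothetical target schema: column name -> type-casting function.
SCHEMA = {"id": int, "city": str, "temp_c": float}

raw = "id,city,temp_c\n1,Chennai,31.5\n2,Hyderabad,29.0\n"

# Parse CSV, cast each column per the schema, and collect typed records.
records = []
for row in csv.DictReader(io.StringIO(raw)):
    records.append({col: cast(row[col]) for col, cast in SCHEMA.items()})

print(json.dumps(records))
# [{"id": 1, "city": "Chennai", "temp_c": 31.5}, {"id": 2, "city": "Hyderabad", "temp_c": 29.0}]
```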
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Java Technical Consultant (Chennai/Trivandrum) - Immediate Joiners Only, from the Financial Services Domain
Work Location: Chennai (Hybrid) - Work From Office near Toll Gate, near Jain College, Rajiv Gandhi Salai, Okkiyam Thoraipakkam, Chennai
Location: Trivandrum (Hybrid) - Work From Office, Technopark Campus, Thiruvananthapuram, Kerala 695581
Work Mode: Hybrid Work From Office (regular office hours)
Required Experience: 8+ years in Java/Spring Boot, PostgreSQL, NiFi or Flink (mandatory)
Working Hours: Normal working hours, Monday to Friday
Start Date: Immediate (immediate joiners only)
Background Verification: Mandatory, through a third-party verification process

Job Description: We are looking for an experienced Lead Java Developer to join our team for a high-impact role in the financial services domain. The ideal candidate should have a strong background in system integration, production support, and deep expertise in Java technologies.

Key Responsibilities: Lead and support the technical implementation of a financial services product. Collaborate with cross-functional teams for end-to-end system integration using APIs, file-based interfaces, and Kafka. Design and deploy scalable batch processes and pipelines; support go-live activities. Troubleshoot and optimize system performance across development, UAT, and production environments. Provide strong production support, including batch job management and critical issue resolution. Engage in banking transformation initiatives, especially in Commercial & Corporate Banking. Ensure solutions are aligned with business workflows, particularly relevant to the Middle East region.
Required Skills: Proficiency in Java, Spring Boot, PostgreSQL, and Apache NiFi or Apache Flink. Strong experience in system integration and Kafka-based architecture. Hands-on with deployment pipelines, batch operations, and monitoring tools. Excellent problem-solving and troubleshooting skills. Prior experience with Commercial & Corporate Banking workflows is a plus. Understanding of banking transformation programs is preferred. Why Join Us? Opportunity to work on impactful financial transformation projects. Collaborate with global banking clients, particularly in the Middle East. Hybrid work culture with a strong focus on innovation and growth. (ref:hirist.tech)
Posted 1 week ago
0 years
10 - 20 Lacs
Bengaluru
On-site
Integration Consultant – o9
Key Responsibilities
Play the integration consultant role on o9 implementation projects. Understand the o9 platform's data model (table structures, linkages, pipelines, optimal designs) for designing various planning use cases. Review and analyze the data provided by the customer, along with its technical/functional intent and inter-dependencies. Participate in technical design and data-requirements gathering, making recommendations in case of inaccurate or missing data. Design and create batch schedules based on frequency and configuration settings for daily/weekly/quarterly/yearly batches. Deliver E2E integration from the partner system to the o9 platform.
Technical Experience
Must have experience with SQL, PySpark, Python, Spark SQL, and ETL tools. Proficiency in databases (SQL Server, Oracle, etc.). Knowledge of DDL, DML, and stored procedures. At least one E2E o9 integration implementation is required. Experience with Airflow, Delta Lake, NiFi, and Kafka is good to have. Any API-based integration experience is an added advantage.
Professional Attributes
Proven ability to work creatively and analytically in a problem-solving environment. Proven ability to build, manage, and foster a team-oriented environment. Excellent problem-solving skills, with excellent written/oral communication and interpersonal skills. Strong collaborator, team player, and individual contributor.
Educational Qualification
BE/BTech/MCA/Bachelor's degree/Master's degree in computer science and related fields of work are preferred.
Job Type: Contractual / Temporary
Contract length: 12 months
Pay: ₹90,000.00 - ₹170,000.00 per month
Work Location: In person
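As a rough illustration of the DDL/DML and batch-aggregation skills listed above, here is a minimal staging-to-target load using SQLite. The table and column names (`stg_orders`, `fct_demand`) are invented for the example and are not o9 platform objects; a real implementation would run against SQL Server/Oracle or as a PySpark job.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: a staging table fed by a partner system, and a target table
# consumed by planning (names are hypothetical).
cur.execute("CREATE TABLE stg_orders (order_id TEXT, qty INTEGER, region TEXT)")
cur.execute("CREATE TABLE fct_demand (region TEXT, total_qty INTEGER)")

# DML: load raw rows, then aggregate into the target, as a daily batch might.
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [("O1", 10, "EMEA"), ("O2", 5, "EMEA"), ("O3", 7, "APAC")],
)
cur.execute("""
    INSERT INTO fct_demand (region, total_qty)
    SELECT region, SUM(qty) FROM stg_orders GROUP BY region
""")
conn.commit()

rows = dict(cur.execute("SELECT region, total_qty FROM fct_demand"))
```

Scheduling this aggregate at the right frequency (daily/weekly/quarterly/yearly) is exactly the batch-configuration work the responsibilities above describe.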
Posted 2 weeks ago
0 years
6 - 7 Lacs
Indore
Remote
Cloud Platform: Amazon Web Services (AWS) – the backbone providing robust, scalable, and secure infrastructure.
Ingestion Layer (Data Ingestion Frameworks):
- Apache NiFi: for efficient, real-time data routing, transformation, and mediation from diverse sources.
- Data Virtuality: facilitates complex ETL and data virtualization, creating a unified view of disparate data.
Data Frameworks (Data Processing & Microservices):
- Rules Engine & Eductor (in-house tools – Scala, Python): our proprietary microservices for specialized data handling and business-logic automation.
- Kafka: our high-throughput, fault-tolerant backbone for real-time data streaming and event processing.
Analytics Layer (Analytics Services & Compute):
- Altair: for powerful data visualization and interactive analytics.
- Apache Zeppelin: our interactive notebook for collaborative data exploration and analysis.
- Apache Spark: our unified analytics engine for large-scale data processing and machine-learning workloads.
Data Presentation Layer (Client-facing & APIs):
- Client Services (React, TypeScript): for dynamic, responsive, and type-safe user interfaces.
- Client APIs (Node.js, Nest.js): for high-performance, scalable backend services.
Access Layer:
- API Gateway (Amazon API Gateway): manages all external API access, ensuring security, throttling, and routing.
- AWS VPN (Clients, Site-to-Site, OpenVPN): secure network connectivity.
- Endpoints & Service Access (S3, Lambda): controlled access to core AWS services.
- DaaS (Data-as-a-Service – Dremio, Data Virtuality, Power BI): empowering self-service data access and insights.
Security Layer:
- Firewall (AWS WAF): protects web applications from common exploits.
- IdM, IAM (Keycloak, AWS Cognito): robust identity and access management.
- Security Groups & Policy (AWS): network-level security and granular access control.
- ACLs (Access Control Lists – AWS): fine-grained control over network traffic.
- VPCs (Virtual Private Clouds – AWS): isolated and secure network environments.
Data Layer (Databases & Storage):
- OpenSearch Services: for powerful search, analytics, and operational data visualization.
- Data Warehouse – AWS Redshift: our primary analytical data store.
- Databases (PostgreSQL, MySQL, OpenSearch): robust relational and search-optimized databases.
- Storage (S3 Object Storage, EBS, EFS): highly scalable, durable, and cost-effective storage solutions.
Compute & Orchestration:
- EKS (Amazon Elastic Kubernetes Service): manages our containerized applications, providing high availability and scalability for microservices.
Job Types: Full-time, Contractual / Temporary
Contract length: 6 months
Pay: ₹50,000.00 - ₹60,000.00 per month
Schedule: Monday to Friday, weekend availability
Work Location: Remote
Posted 2 weeks ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. 
ETL Tester Associate - Operate
Job Summary
A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management, and managed-services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.
Minimum Degree Required: Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science, Management Information Systems
Minimum Year(s) of Experience: Minimum of 2 years of experience
Certification(s) Preferred: US certification(s)
Required Knowledge/Skills
As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.
Key Responsibilities
Collaborate with data engineers to understand ETL workflows and requirements. Perform data validation and testing to ensure data accuracy and integrity. Create and maintain test plans, test cases, and test data. Identify, document, and track defects, and work with development teams to resolve issues.
Participate in design and code reviews to provide feedback on testability and quality. Develop and maintain automated test scripts using Python for ETL processes. Ensure compliance with industry standards and best practices in data testing. Qualifications Solid understanding of SQL and database concepts. Proven experience in ETL testing and automation. Strong proficiency in Python programming. Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar. Knowledge of data warehousing and data modeling concepts. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Experience with version control systems like Git. Preferred Qualifications Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with CI/CD pipelines and tools like Jenkins or GitLab. Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
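The data-validation responsibilities above often reduce to scripted reconciliation checks between a source and its loaded target. Here is a minimal sketch using SQLite; the table and column names (`src`, `tgt`, `id`, `amt`) are placeholders, and a real suite would run the same SQL against the warehouse under test.

```python
import sqlite3

def validate_load(conn, src, tgt, key, measure):
    """Reconciliation checks an ETL tester might script:
    row-count parity, measure totals, and keys missing from the target.
    Table/column names are caller-supplied placeholders."""
    def q(sql):
        return conn.execute(sql).fetchone()[0]
    return {
        "count_match": q(f"SELECT COUNT(*) FROM {src}") == q(f"SELECT COUNT(*) FROM {tgt}"),
        "sum_match": q(f"SELECT SUM({measure}) FROM {src}") == q(f"SELECT SUM({measure}) FROM {tgt}"),
        # Rows present in source but absent from target (anti-join).
        "missing_keys": q(
            f"SELECT COUNT(*) FROM {src} s LEFT JOIN {tgt} t "
            f"ON s.{key} = t.{key} WHERE t.{key} IS NULL"
        ),
    }

# Toy source/target pair where the load succeeded completely.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amt REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amt REAL)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
report = validate_load(conn, "src", "tgt", "id", "amt")
```

Checks like these are the kind that get promoted into the automated Python test scripts the qualifications list calls for.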
Posted 2 weeks ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
Grade Level (for internal use): 10
The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.
The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.
What's in it for you? Opportunities for innovation and learning new state-of-the-art technologies, and the chance to work in a pure agile and scrum methodology.
Responsibilities
Design and implement software-related projects. Perform analyses and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
Basic Qualifications
Bachelor's degree in Computer Science or equivalent. 7+ years of related experience. Passionate, smart, and articulate developer. Strong C#, .NET, and SQL skills. Experience implementing web services (WCF, RESTful JSON, SOAP, TCP), Windows services, unit tests, and dependency injection. Able to demonstrate strong OOP skills. Able to work well individually and with a team. Strong problem-solving skills. Good work ethic, self-starter, and results-oriented. Agile/Scrum experience is a plus. Exposure to data engineering and big-data technologies such as Hadoop, big-data processing engines/Scala, NiFi, and ETL is a plus. Experience with container platforms is a plus. Experience working in cloud computing environments such as AWS, Azure, or GCP.
What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. 
That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 317843 Posted On: 2025-07-20 Location: Hyderabad, Telangana, India
Posted 2 weeks ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 10
The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.
The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.
What's in it for you? Opportunities for innovation and learning new state-of-the-art technologies, and the chance to work in a pure agile and scrum methodology.
Responsibilities
Design and implement software-related projects. Perform analyses and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
Basic Qualifications
Bachelor's degree in Computer Science or equivalent. 7+ years of related experience. Passionate, smart, and articulate developer. Strong C#, .NET, and SQL skills. Experience implementing web services (WCF, RESTful JSON, SOAP, TCP), Windows services, unit tests, and dependency injection. Able to demonstrate strong OOP skills. Able to work well individually and with a team. Strong problem-solving skills. Good work ethic, self-starter, and results-oriented. Agile/Scrum experience is a plus. Exposure to data engineering and big-data technologies such as Hadoop, big-data processing engines/Scala, NiFi, and ETL is a plus. Experience with container platforms is a plus. Experience working in cloud computing environments such as AWS, Azure, or GCP.
Posted 2 weeks ago
7.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role
Grade Level (for internal use): 10
The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.
The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.
What's in it for you? Opportunities for innovation and learning new state-of-the-art technologies, and the chance to work in a pure agile and scrum methodology.
Responsibilities
Design and implement software-related projects. Perform analyses and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
Basic Qualifications
Bachelor's degree in Computer Science or equivalent. 7+ years of related experience. Passionate, smart, and articulate developer. Strong C#, .NET, and SQL skills. Experience implementing web services (WCF, RESTful JSON, SOAP, TCP), Windows services, unit tests, and dependency injection. Able to demonstrate strong OOP skills. Able to work well individually and with a team. Strong problem-solving skills. Good work ethic, self-starter, and results-oriented. Agile/Scrum experience is a plus. Exposure to data engineering and big-data technologies such as Hadoop, big-data processing engines/Scala, NiFi, and ETL is a plus. Experience with container platforms is a plus. Experience working in cloud computing environments such as AWS, Azure, or GCP.
What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. 
That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 317843 Posted On: 2025-07-20 Location: Hyderabad, Telangana, India
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
Job Description: We are looking for a skilled PySpark Developer with 2-3 or 4-5 years of experience to join our team. As a PySpark Developer, you will be responsible for developing and maintaining data processing pipelines using PySpark, Apache Spark's Python API. You will work closely with data engineers, data scientists, and other stakeholders to design and implement scalable and efficient data processing solutions. A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is required. The ideal candidate should have strong expertise in the Big Data ecosystem, including Spark, Hive, Sqoop, HDFS, MapReduce, Oozie, YARN, HBase, and NiFi. The candidate should be below 35 years of age and have experience in designing, developing, and maintaining PySpark data processing pipelines that process large volumes of structured and unstructured data. The candidate will collaborate with data engineers and data scientists to understand data requirements and design efficient data models and transformations. Optimizing and tuning PySpark jobs for performance, scalability, and reliability is a key responsibility, as is implementing data quality checks, error handling, and monitoring mechanisms to ensure data accuracy and pipeline robustness. The candidate should also develop and maintain documentation for PySpark code, data pipelines, and data workflows. Experience developing production-ready Spark applications using Spark RDD APIs, DataFrames, Datasets, Spark SQL, and Spark Streaming is required, along with strong experience with Hive bucketing and partitioning and writing complex Hive queries using analytical functions. Knowledge of writing custom UDFs in Hive to support custom business requirements is a plus. If you meet the above qualifications and are interested in this position, please email your resume, mentioning the position applied for in the subject line, to: careers@cdslindia.com.
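The "complex Hive queries using analytical functions" requirement above refers to standard SQL window functions, which HiveQL shares. A minimal sketch using Python's built-in sqlite3 as a stand-in engine (the table, columns, and data are made up for illustration; in Hive the same RANK() OVER (PARTITION BY ... ORDER BY ...) pattern applies):

```python
import sqlite3

# In-memory database standing in for a Hive warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100), ("east", 300), ("west", 200), ("west", 50)],
)

# Analytical (window) function: rank rows within each region by amount,
# the same pattern used in HiveQL analytical queries.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()

for r in rows:
    print(r)
```

In Hive, the equivalent query would typically run over a bucketed and partitioned table so the PARTITION BY clause can exploit the physical layout.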
Posted 2 weeks ago
4.0 - 5.0 years
0 Lacs
Greater Kolkata Area
On-site
Role : Data Integration Specialist Experience : 4 - 5 Years Location : India Employment Type : Full-time About The Role We are looking for a highly skilled and motivated Data Integration Specialist with 4 to 5 years of hands-on experience to join our growing team in India. In this role, you will be responsible for designing, developing, implementing, and maintaining robust data pipelines and integration solutions that connect disparate systems and enable seamless data flow across the enterprise. You'll play a crucial part in ensuring data availability, quality, and consistency for various analytical and operational needs. Key Responsibilities ETL/ELT Development : Design, develop, and optimize ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes using industry-standard tools and technologies. Data Pipeline Construction : Build and maintain scalable and efficient data pipelines from various source systems (databases, APIs, flat files, streaming data, cloud sources) to target data warehouses, data lakes, or analytical platforms. Tool Proficiency : Hands-on experience with at least one major ETL tool such as Talend, Informatica PowerCenter, SSIS, Apache NiFi, IBM DataStage, or similar platforms. Database Expertise : Proficient in writing and optimizing complex SQL queries across various relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL) and NoSQL databases. Cloud Data Services : Experience with cloud-based data integration services on platforms like AWS (Glue, Lambda, S3, Redshift), Azure (Data Factory, Synapse Analytics), or GCP (Dataflow, BigQuery) is highly desirable. Scripting : Develop and maintain scripts (e.g., Python, Shell scripting) for automation, data manipulation, and orchestration of data processes. Data Modeling : Understand and apply data modeling concepts (e.g., dimensional modeling, Kimball/Inmon methodologies) for data warehousing solutions. 
Data Quality & Governance : Implement data quality checks, validation rules, and participate in establishing data governance best practices to ensure data accuracy and reliability. Performance Tuning : Monitor, troubleshoot, and optimize data integration jobs and pipelines for performance, scalability, and reliability. Collaboration & Documentation : Work closely with data architects, data analysts, business intelligence developers, and business stakeholders to gather requirements, design solutions, and deliver data assets. Create detailed technical documentation for data flows, mappings, and transformations. Problem Solving : Identify and resolve complex data-related issues, ensuring data integrity and consistency. Qualifications Education : Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related quantitative field. Experience : 4 to 5 years of dedicated experience in data integration, ETL development, or data warehousing. Core Skills : Strong proficiency in SQL and at least one leading ETL tool (as listed above). Programming : Hands-on experience with Python or Shell scripting for data manipulation and automation. Databases : Solid understanding of relational database concepts and experience with various database systems. Analytical Thinking : Excellent analytical, problem-solving, and debugging skills with attention to detail. Communication : Strong verbal and written communication skills to articulate technical concepts to both technical and non-technical audiences. Collaboration : Ability to work effectively in a team environment and collaborate with cross-functional teams. Preferred/Bonus Skills Experience with real-time data integration or streaming technologies (e.g., Kafka, Kinesis). Knowledge of Big Data technologies (e.g., Hadoop, Spark). Familiarity with CI/CD pipelines for data integration projects. Exposure to data visualization tools (e.g., Tableau, Power BI). 
Experience in specific industry domains (e.g., Finance, Healthcare, Retail).
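The extract-transform-load flow this role centers on can be sketched with only the Python standard library. This is a minimal illustration, not a production pipeline; the CSV sample, target schema, and validation rule (non-empty amount) are all hypothetical:

```python
import csv
import io
import sqlite3

# Extract: parse a CSV source (an in-memory sample standing in for a real file).
raw = io.StringIO("id,amount\n1,100\n2,\n3,250\n")
records = list(csv.DictReader(raw))

# Transform + data-quality check: reject rows failing validation, cast types.
clean, rejected = [], []
for rec in records:
    if rec["amount"].strip():
        clean.append((int(rec["id"]), int(rec["amount"])))
    else:
        rejected.append(rec)  # route bad rows to an error channel for review

# Load: write validated rows into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, amount INTEGER)")
conn.executemany("INSERT INTO target VALUES (?, ?)", clean)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM target").fetchone()
print(total, "rejected:", len(rejected))
```

In practice the same extract/validate/load stages are what ETL tools like Talend, Informatica, or Apache NiFi orchestrate at scale, with the rejected-row channel feeding data-quality monitoring.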
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data Engineering Manager at Micron Technology Inc., you will play a crucial role within the Technology Solutions group of the Smart Manufacturing and AI organization. Your responsibilities will involve working closely with Micron's Front End Manufacturing and Planning Ops business area, focusing on data engineering, Machine Learning, and advanced analytics solutions. We are seeking a leader with a strong technical background in Big Data and Cloud Data warehouse technologies, particularly in Cloud data warehouse platforms like Snowflake and GCP, monitoring solutions such as Splunk, and automation and machine learning using Python. Your primary tasks will include leading a team of Data Engineers, providing technical and people leadership, and ensuring the successful delivery of critical projects and production support. You will engage team members in their career development, maintain a positive work culture, and participate in the design, architecture review, and deployment of big data and cloud data warehouse solutions. Additionally, you will collaborate with key project stakeholders, analyze project needs, and translate requirements into technical specifications for the team of data engineers. To excel in this role, you should have a solid background in developing, delivering, and supporting big data engineering and advanced analytics solutions, with at least 10 years of experience in the field. Managing or leading data engineering teams for 6+ years and hands-on experience in building Cloud Data-centric solutions in GCP or other cloud platforms for 4-5 years is essential. Proficiency in Python programming, experience with Spark, ELT or ETL techniques, database management systems like SQL Server and Snowflake, and strong domain knowledge in Manufacturing Planning and Scheduling data are highly desired. Furthermore, you should possess intermediate to advanced programming skills, excellent communication abilities, and a passion for data and information. 
Being self-motivated, adaptable to a fast-paced environment, and having a Bachelor's degree in Computer Science, Management Information Systems, or related fields are prerequisites for this role. Micron Technology is a pioneering industry leader in memory and storage solutions, dedicated to transforming how information enriches lives globally. Our commitment to innovation, technology leadership, and operational excellence drives us to deliver high-performance memory and storage products through our Micron and Crucial brands, fueling advancements in artificial intelligence and 5G applications. If you are motivated by the power of data and eager to contribute to cutting-edge solutions, we encourage you to explore career opportunities with us at micron.com/careers.,
Posted 2 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: 8+ years

Key Responsibilities: Collaborate with business teams to gather and understand requirements. Lead hands-on design, development, and deployment of data pipelines and integration workflows. Support testing, go-live, and hypercare phases. Act as a mentor and guide to offshore Kafka developers; review code and ensure quality deliverables. Take full ownership of assigned project deliverables.

Required Skills: Python | MS SQL | Java | Azure Databricks | Spark | Kinesis | Kafka | Sqoop | Hive | Apache NiFi | Unix Shell Scripting
Posted 2 weeks ago
0 years
10 - 18 Lacs
Hyderābād
On-site
Job Title: Integration Consultant – o9 Platform
Locations: Bangalore | Mumbai | Gurugram | Pune | Kolkata | Hyderabad | Jaipur
Mode: Onsite / Hybrid
Contract: 12 Months

About the Role: We are seeking a skilled and proactive Integration Consultant – o9 to join our implementation team. In this role, you will be responsible for end-to-end integration of partner systems into the o9 platform. You will leverage your technical expertise in SQL, Python, PySpark, and ETL tools to design and implement scalable, efficient data pipelines and batch processes that support key planning use cases across various industries.

Key Responsibilities: Act as an Integration Consultant on o9 implementation projects. Understand and work with the o9 platform’s data model, including table structures, linkages, and pipelines. Analyze customer data, assess data quality and integrity, and map technical and functional relationships. Participate in technical design and data requirement gathering sessions with clients and internal teams. Make recommendations for missing or inaccurate data and guide teams on optimal data integration strategies. Design and implement batch schedules (daily, weekly, quarterly, yearly) as per project requirements. Perform end-to-end integration implementation from external systems into the o9 platform. Collaborate closely with cross-functional teams, including solution consultants, data engineers, and client stakeholders.

Technical Experience & Skills: Must-Have: Strong hands-on experience with SQL, PySpark, Python, Spark SQL, and ETL tools. Proficiency in databases like SQL Server, Oracle, etc. Familiarity with DDL/DML, stored procedures, and query optimization. Experience in at least one full-cycle o9 platform integration project. Good to Have: Experience with Apache Airflow, Delta Lake, Apache NiFi, or Kafka. Familiarity with API-based integrations and data exchange formats (JSON, XML, REST, etc.). 
Professional Attributes: Strong problem-solving skills with a creative and analytical approach. Proven ability to work independently and collaboratively in a team-oriented environment. Excellent written and verbal communication and interpersonal skills. Ability to adapt in fast-paced, client-facing environments with tight deadlines.

Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. Degrees such as BE, BTech, or MCA are preferred.

Job Type: Full-time Pay: ₹90,000.00 - ₹150,000.00 per month

Application Question(s): How many years of hands-on experience do you have as an Integration Consultant working with the o9 platform? Have you specifically worked with the o9 platform before? How familiar are you with the o9 data model and pipelines? Do you have hands-on experience with SQL? How many years of experience do you have with ETL tools? Have you worked with PySpark and Spark SQL in data processing/integration? Do you have any experience with DDL or DML scripts in previous roles? Have you worked with Airflow, Delta Lake, Apache NiFi, or Kafka? If yes, specify the years of experience and which tools. Are you an immediate joiner? Have you been part of at least one end-to-end o9 integration implementation? Work Location: In person Speak with the employer +91 7225016963
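The batch-schedule design work described above (daily, weekly, quarterly, yearly runs) largely reduces to mapping a run date to the data window that run should cover. A rough stdlib-only sketch; the helper name and the window conventions (previous day, previous Monday-Sunday week, previous quarter, previous year) are assumptions for illustration, not o9 specifics:

```python
from datetime import date, timedelta

def batch_window(run_date: date, frequency: str) -> tuple[date, date]:
    """Hypothetical helper: the (start, end) date window a batch run covers."""
    if frequency == "daily":
        return run_date - timedelta(days=1), run_date
    if frequency == "weekly":
        # Cover the previous Monday-Sunday week.
        end = run_date - timedelta(days=run_date.isoweekday())
        return end - timedelta(days=6), end
    if frequency == "quarterly":
        q = (run_date.month - 1) // 3  # current quarter index, 0-3
        first_of_q = date(run_date.year, q * 3 + 1, 1)
        prev_end = first_of_q - timedelta(days=1)
        prev_start = date(prev_end.year, ((prev_end.month - 1) // 3) * 3 + 1, 1)
        return prev_start, prev_end
    if frequency == "yearly":
        return date(run_date.year - 1, 1, 1), date(run_date.year - 1, 12, 31)
    raise ValueError(f"unknown frequency: {frequency}")

print(batch_window(date(2025, 7, 20), "weekly"))
```

A scheduler (cron, Airflow, or the platform's own batch configuration) would then invoke each pipeline with the window this function yields for its configured frequency.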
Posted 2 weeks ago
0 years
10 - 18 Lacs
Greater Kolkata Area
On-site
Integration Consultant – o9

Key Responsibilities: Play the integration consultant role on o9 implementation projects. Understand the o9 platform’s data model (table structures, linkages, pipelines, optimal designs) for designing various planning use cases. Review and analyze the data provided by the customer along with its technical/functional intent and inter-dependencies. Participate in technical design and data-requirements gathering, making recommendations in case of inaccurate or missing data. Design and create batch schedules based on frequency and configuration settings for daily/weekly/quarterly/yearly batches. Deliver end-to-end integration implementation from the partner system to the o9 platform.

Technical Experience: Must have experience with SQL, PySpark, Python, Spark SQL, and ETL tools. Proficiency in databases (SQL Server, Oracle, etc.). Knowledge of DDL, DML, and stored procedures. At least one end-to-end o9 integration implementation is required. Experience with Airflow, Delta Lake, NiFi, or Kafka is good to have. Any API-based integration experience is an added advantage.

Professional Attributes: Proven ability to work creatively and analytically in a problem-solving environment. Proven ability to build, manage, and foster a team-oriented environment. Excellent problem-solving skills with excellent written/oral communication and interpersonal skills. Strong collaborator, team player, and individual contributor.

Educational Qualification: BE/BTech/MCA or a Bachelor's/Master's degree in computer science and related fields preferred.

Skills: Airflow, SQL Server, Spark SQL, o9 platform, PySpark, Kafka, Python, SQL, ETL tools, API-based integration, Oracle
Posted 2 weeks ago
0 years
10 - 18 Lacs
Pune, Maharashtra, India
On-site
Integration Consultant – o9

Key Responsibilities:
- Play the integration consultant role on o9 implementation projects.
- Understand the o9 platform's data model (table structures, linkages, pipelines, optimal designs) to design various planning use cases.
- Review and analyze the data provided by the customer along with its technical/functional intent and inter-dependencies.
- Participate in technical design and data requirements gathering, making recommendations in case of inaccurate or missing data.
- Design and create batch schedules based on frequency and configuration settings for daily/weekly/quarterly/yearly batches.
- Deliver end-to-end (E2E) integration implementation from partner systems to the o9 platform.

Technical Experience:
- Must have experience with SQL, PySpark, Python, Spark SQL, and ETL tools.
- Proficiency in databases (SQL Server, Oracle, etc.).
- Knowledge of DDL, DML, and stored procedures.
- At least one E2E o9 integration implementation is required.
- Good to have: experience with Airflow, Delta Lake, NiFi, and Kafka.
- Any API-based integration experience will be an added advantage.

Professional Attributes:
- Proven ability to work creatively and analytically in a problem-solving environment.
- Proven ability to build, manage, and foster a team-oriented environment.
- Excellent problem-solving skills with strong written/oral communication and interpersonal skills.
- Strong collaborator, team player, and individual contributor.

Educational Qualification:
BE/BTech/MCA or a Bachelor's/Master's degree in Computer Science or a related field is preferred.

Skills: Airflow, SQL Server, Spark SQL, o9 platform, PySpark, E2E integration, API-based integration, Kafka, Python, SQL, ETL tools, Oracle
Posted 2 weeks ago