0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Description

Role Proficiency: Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with the specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation and requirements, and test cases/results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: Advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions for customers; complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or manage user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: Set FAST goals and provide feedback; understand the aspirations of team members and provide guidance and opportunities; ensure the team is engaged in the project.
- Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components.
- Use data models.
- Estimate the time, effort, and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team; maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices.
- Set goals for self and team; provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks.
- Build confidence with customers by meeting deliverables on time and with quality.
- Make appropriate utilization of software and hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages (proficient in multiple skill clusters)
- DBMS, operating systems, and software platforms
- Software Development Life Cycle
- Agile methods (Scrum or Kanban)
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technologies and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments

Job Title: SDET – Data Integration & Transformation
Location: Pune, India
Job Type: Full-time
Experience Level: [Mid-Level / Senior]
Department: Quality Engineering / Data Engineering
Work Time: up to 8 PM IST

Mandatory: Java, Selenium, experience testing and automating data engineering pipelines; data pipeline testing, quality, and anomaly detection.
Job Summary: We are seeking a highly skilled and detail-oriented SDET (Software Development Engineer in Test) with expertise in test automation for data integration and transformation processes. The ideal candidate will work closely with data engineers and developers to build robust, automated testing frameworks ensuring data quality, consistency, and integrity across complex ETL and integration pipelines.

Key Responsibilities:
- Design, develop, and maintain automated test frameworks for validating data integration and transformation workflows.
- Collaborate with data engineers to understand data flow, business rules, and transformation logic.
- Create and execute test cases for ETL processes, data pipelines, and APIs.
- Validate data quality, schema, completeness, and correctness across multiple data sources and targets.
- Automate regression, integration, and end-to-end testing for data-driven applications.
- Implement tests for data accuracy, consistency, duplication, and loss.
- Work closely with DevOps teams to integrate test automation into CI/CD pipelines.
- Participate in requirement analysis, risk assessment, and test planning activities.
- Document defects clearly and collaborate on root cause analysis and resolutions.

Required Skills & Experience:
- Strong experience with test automation in data integration and transformation environments.
- Solid understanding of ETL/ELT pipelines, data validation, and transformation logic.
- Proficiency in writing SQL queries for test validation and data profiling (a brief illustrative sketch follows at the end of this posting).
- Hands-on experience with Python, Java, or similar scripting languages for test automation.
- Familiarity with data integration tools (e.g., Apache NiFi, Talend, Informatica) is a plus.
- Understanding of data formats like JSON, XML, Avro, and Parquet.
- Experience with test frameworks such as PyTest, JUnit, TestNG, or similar.
- Knowledge of CI/CD tools like Jenkins, GitLab CI, or CircleCI.
- Familiarity with big data platforms and distributed systems (e.g., Kafka, Spark, Hadoop) is a plus.

Preferred Qualifications:
- Exposure to cloud data ecosystems (e.g., AWS Glue, Redshift, S3, EMR, GCP BigQuery).
- Experience with data cataloging and data lineage tools.
- Understanding of data governance and security compliance.
- Strong communication and collaboration skills with both technical and non-technical stakeholders.

Mandatory Soft Skills:
- Good written and verbal communication
- Strong sense of ownership and ability to drive tasks independently
- Proactive about raising blockers and suggesting solutions
- Able to collaborate effectively across backend, frontend, and DevOps teams
- Comfortable working in a fast-paced, asynchronous environment

Skills: Java, Selenium, REST API, ETL Testing
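For illustration of the kind of automated data-quality checks this role describes, here is a minimal PyTest sketch against an in-memory SQLite stand-in for the warehouse; the table and column names (orders_target, order_id) are hypothetical, and a real suite would run against the actual target database.

```python
# Hypothetical PyTest data-quality checks for an ETL pipeline target table.
# Table and column names are illustrative only.
import sqlite3

import pytest


@pytest.fixture(scope="module")
def conn():
    # Stand-in for the real warehouse connection.
    connection = sqlite3.connect(":memory:")
    connection.executescript(
        """
        CREATE TABLE orders_target (order_id INTEGER, amount REAL, customer_id INTEGER);
        INSERT INTO orders_target VALUES (1, 10.5, 100), (2, 20.0, 101);
        """
    )
    yield connection
    connection.close()


def test_no_null_keys(conn):
    # Completeness: key columns must never be NULL after the load.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM orders_target WHERE order_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0


def test_no_duplicate_keys(conn):
    # Uniqueness: the transformation must not fan out duplicate order_ids.
    dupes = conn.execute(
        "SELECT COUNT(*) FROM (SELECT order_id FROM orders_target "
        "GROUP BY order_id HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    assert dupes == 0
```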
Posted 1 month ago
2.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Profile: Sr. DW BI Developer
Location: Sector 64, Noida (Work from Office)

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and evolve the current ERP system for the future.

Key Responsibilities:
- As a Sr. DW BI Developer, participate in the design, development, customization, and maintenance of software applications.
- Analyse the different applications/products, and design and implement the DW using best practices.
- Apply rich data governance experience: data security, data quality, provenance/lineage.
- Maintain a close working relationship with the other application stakeholders.
- Develop secure and high-performance web applications.
- Apply knowledge of software development life-cycle methodologies, e.g. Iterative, Waterfall, Agile.
- Design and architect future releases of the platform.
- Participate in troubleshooting application issues.
- Work jointly with other teams and partners handling different aspects of the platform creation.
- Track advancements in software development technologies and apply them judiciously in the solution roadmap.
- Ensure all quality controls and processes are adhered to.
- Plan the major and minor releases of the solution.
- Ensure robust configuration management.
- Work closely with the Engineering Manager on different aspects of product lifecycle management.
- Demonstrate the ability to work independently in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications:
- End-to-end lifecycle experience with data warehousing, data lakes, and reporting.
- Experience maintaining/managing data warehouses.
- Design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including big data and cloud).
- Strong SQL and analytical skills.
- Experience in Power BI, Tableau, QlikView, Qlik Sense, etc.
- Experience in Microsoft Azure services.
- Experience in developing and supporting ADF pipelines.
- Experience in Azure SQL Server / Databricks / Azure Analysis Services.
- Experience in developing tabular models.
- Experience in working with APIs.
- Minimum 2 years of experience in a similar role.
- Experience with data warehousing and data modelling; strong experience in SQL.
- 2-6 years of total experience in building DW/BI systems.
- Experience with ETL and working with large-scale datasets.
- Proficiency in writing and debugging complex SQL (a brief illustrative sketch follows at the end of this posting).
- Prior experience working with global clients.
- Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop.
- Storage such as HDFS, object storage (S3, etc.), RDBMS, MPP, and NoSQL databases.
- Experience with distributed data management and data failover, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability.
- Experience in end-to-end project implementation in the cloud (Azure / AWS / GCP) as a DW BI Developer.
- Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.
- Prior experience of working in clouds like Azure, AWS, and GCP.
- Prior experience of working with global clients.

To know our Privacy Policy, please click on the link below or copy-paste the URL into your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
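As a rough illustration of the SQL-based validation work this role calls for, the sketch below reconciles row counts and a control total between a source table and its warehouse copy; the connections, table, and column names are placeholders, and a real check would use the actual source and Azure SQL/Databricks connections.

```python
# Hypothetical source-vs-warehouse reconciliation check; connections,
# table names, and columns are illustrative only.
import sqlite3


def reconcile(src_conn, dw_conn, table, amount_col):
    """Compare row counts and a control total between source and warehouse."""
    src_count, src_sum = src_conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
    ).fetchone()
    dw_count, dw_sum = dw_conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
    ).fetchone()
    assert src_count == dw_count, f"row-count mismatch: {src_count} vs {dw_count}"
    assert abs(src_sum - dw_sum) < 1e-6, f"control-total mismatch: {src_sum} vs {dw_sum}"


if __name__ == "__main__":
    # In-memory databases stand in for the real source and warehouse.
    src, dw = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for c in (src, dw):
        c.execute("CREATE TABLE sales (amount REAL)")
        c.executemany("INSERT INTO sales VALUES (?)", [(10.0,), (25.5,)])
    reconcile(src, dw, "sales", "amount")
    print("source and warehouse are in sync")
```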
Posted 1 month ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Type: Full Time
Experience: 1 to 2 years
Type: Virtual Hiring
Last Date: 30-June-2025
Posted on: 18-June-2025
Education: BE/B.Tech, MCA, ME/M.Tech

ADVERTISEMENT No. 02

Data Scientist / AI Engineer (2 Posts)
Age: 25 to 35 years

Qualification (Mandatory): Full-time B.E./B.Tech with first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience (Essential):
1. Minimum 3 years of (post basic educational qualification) experience in a related field, of which:
- 2+ years of experience with programming languages frequently used in data science (R/Python).
- 2+ years of experience in model development, model validation, or a related field.
- 2+ years of experience in data analytics.
- 2+ years of experience in relational databases or any NoSQL database, including graph databases.
2. Experience in cloud-based application/service development.
3. Experience in natural language processing, web crawling, and neural networks.
4. Experience in projects with machine learning / artificial intelligence technologies.
5. Excellent communication skills and the ability to work as part of a multicultural product development team.
6. End-to-end experience from data extraction to modelling and its validation.
7. Experience of working in a project environment as a developer.
8. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design and development, data mining, and segmentation techniques.
2. Expertise in machine learning technologies.
3. Expertise in testing and validation of the quality and accuracy of AI models.
4. Expertise in developing models using structured, semi-structured, and unstructured data.
5. Expertise in analytical databases like Vertica DB or similar platforms.
6. Data modelling and data intelligence / data cataloguing skills with tools like Alation.
7. SQL (DDL/DML/DQL).

Desirable Qualities:
1. Good understanding of data models and types of dimension modelling.
2. Experience in conversational AI and dialogue systems.
3. Strong understanding of explainable and responsible/ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Data Scientist-cum-BI Developer (1 Post)
Age: 23 to 30 years

Qualification (Mandatory): Full-time B.E./B.Tech with first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification/assignments/projects in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience (Essential):
1. Minimum 1 year of (post basic educational qualification) working experience on assignments/projects/jobs related to ML/AI.
2. Experience in projects with machine learning / artificial intelligence technologies.
3. Excellent communication skills and the ability to work as part of a multicultural product development team.
4. End-to-end experience from data extraction to modelling and its validation.
5. Experience of working in a project environment as a developer.
6. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design and development, data mining, and segmentation techniques.
2. Expertise in machine learning technologies.
3. Expertise in testing and validation of the quality and accuracy of AI models.
4. Expertise in developing models using structured, semi-structured, and unstructured data.
5. Expertise in analytical databases like Vertica DB or similar platforms.
6. Data modelling and data intelligence / data cataloguing skills with tools like Alation.
7. SQL (DDL/DML/DQL).

Desired Skill Set:
1. Good understanding of data models and types of dimension modelling.
2. Experience in conversational AI and dialogue systems.
3. Strong understanding of explainable and responsible/ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling (a brief illustrative sketch follows at the end of this posting).

Candidates can apply only online from 16 June 2025 to 30 June 2025.

Note: This is an aggregated job, shared to bring relevant opportunities to job seekers. Hireclap is not responsible for or authorized to run this recruitment process.
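Both posts list combining models through ensemble modelling among the responsibilities; below is a minimal scikit-learn sketch of soft voting over two base models, using synthetic data purely for illustration. In practice X and y would come from the preprocessed dataset described above.

```python
# Minimal sketch of ensemble modelling via soft voting (scikit-learn).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a real, preprocessed dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ],
    voting="soft",  # average predicted probabilities rather than hard labels
)

# Validate the combined model the same way each base model would be validated.
scores = cross_val_score(ensemble, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```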
Posted 1 month ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Description

We are seeking a highly motivated ETL Data Engineer to join our dynamic data team. In this role, you will play a pivotal part in our data pipeline initiatives, where your expertise in ETL processes will be essential for transforming raw data into actionable insights. You will work closely with data analysts, data scientists, and other stakeholders to understand their data requirements and ensure that data is made accessible in a meaningful way. Your proficiency in designing and implementing robust ETL solutions will enable the organization to maintain high data quality and availability, facilitating key business decisions.

As an ETL Data Engineer, you will leverage your technical skills to develop data workflows, optimize data transformation processes, and troubleshoot data issues as they arise. You will also be responsible for ensuring compliance with data governance policies while utilizing best practices in data engineering. If you are passionate about data management and enjoy working in a fast-paced, collaborative environment, this is the perfect opportunity to contribute significantly to our data initiatives and to grow your career.

Responsibilities:
- Design and develop ETL processes to facilitate data extraction, transformation, and loading from various sources (a minimal sketch follows at the end of this posting).
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into technical specifications.
- Ensure data quality and integrity through monitoring and validation of ETL processes and workflows.
- Optimize performance of existing ETL workflows and data pipelines to improve efficiency and reduce processing time.
- Implement data governance practices to maintain compliance with industry regulations and internal policies.
- Maintain and support ETL tools and frameworks, ensuring systems are running smoothly and efficiently.
- Document data processes and standards, providing training and support to team members as needed.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Data Engineer or in a similar data engineering role.
- Strong proficiency in ETL tools such as Apache NiFi, Talend, Informatica, or similar technologies.
- Experience with databases such as SQL Server, Oracle, MySQL, or PostgreSQL, and knowledge of SQL scripting.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud for data warehousing solutions.
- Understanding of data modeling concepts and experience with data architecture.
- Ability to work collaboratively in a team environment and communicate effectively with both technical and non-technical stakeholders.

(ref:hirist.tech)
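As a small illustration of the extract-transform-load flow this role centres on, here is a minimal Python sketch; the file path, table, and field names are hypothetical, and a production pipeline would add error handling, logging, and one of the ETL frameworks named above.

```python
# Minimal extract-transform-load sketch; paths and field names are
# hypothetical placeholders for a real source file and warehouse table.
import csv
import sqlite3


def extract(path):
    # Extract: stream rows from a (hypothetical) CSV source.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    # Transform: cleanse and standardize before loading (stand-in for
    # real business rules); drop rows with a missing amount.
    for row in rows:
        if row["amount"]:
            yield (row["order_id"].strip(), float(row["amount"]))


def load(records, conn):
    # Load: append cleansed records into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")  # stand-in for the real target
    load(transform(extract("orders.csv")), conn)
```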
Posted 1 month ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology->Java->Java (ALL), Technology->Java->Springboot, Angular, Apache NiFi

A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life-cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills
Posted 1 month ago
4.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Req ID: 324653

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Pune, Mahārāshtra (IN-MH), India.

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations (a brief illustrative sketch follows at the end of this posting).
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting software engineering, data engineering, or data analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Demonstrated production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
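As a rough illustration of the kind of pipeline coding the responsibilities above describe, here is a small PySpark batch-transformation sketch (pip install pyspark); the paths, column names, and aggregation are assumptions for illustration, not NTT DATA's actual stack.

```python
# Sketch of a simple batch transformation in PySpark; paths and columns
# are illustrative stand-ins for a real landing zone and curated layer.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

# Extract: read raw events from a landing zone (local path standing in
# for S3/ADLS/GCS).
raw = spark.read.json("landing/orders/*.json")

# Transform: filter bad records and aggregate per customer per day.
daily = (
    raw.filter(F.col("amount") > 0)
    .groupBy("customer_id", F.to_date("event_ts").alias("day"))
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("orders"))
)

# Load: write partitioned Parquet for downstream analytics.
daily.write.mode("overwrite").partitionBy("day").parquet("curated/orders_daily")

spark.stop()
```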
Posted 1 month ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Job Summary

A career in our Managed Services team will provide you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Job Description

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

JD for ETL Tester at Associate level: As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes.
You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Minimum Degree Required: Bachelor's degree
Degree Preferred: Bachelor's in Computer Engineering
Minimum Years of Experience: 7 years of IT experience
Certifications Required: NA
Certifications Preferred: Automation Specialist for TOSCA, LambdaTest certifications

Required Knowledge/Skills:
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes (a brief illustrative sketch follows at the end of this posting).
- Ensure compliance with industry standards and best practices in data testing.

Qualifications:
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Knowledge/Skills

Demonstrates extensive knowledge and/or a proven record of success in the following areas:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
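To illustrate the Python-based ETL test automation named above, here is a minimal PyTest sketch that checks a hypothetical transformation rule (amounts converted to cents) and guards against dropped rows, using in-memory SQLite as a stand-in for the real source and target databases.

```python
# Hypothetical PyTest checks for an ETL transformation; the rule
# (amount -> amount_cents) and all table names are illustrative only.
import sqlite3

import pytest


@pytest.fixture()
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE src (id INTEGER, amount REAL);
        CREATE TABLE tgt (id INTEGER, amount_cents INTEGER);
        INSERT INTO src VALUES (1, 10.50), (2, 3.99);
        INSERT INTO tgt VALUES (1, 1050), (2, 399);
        """
    )
    yield conn
    conn.close()


def test_transformation_rule(db):
    # Join source to target on the business key and flag rule violations.
    mismatches = db.execute(
        """
        SELECT s.id FROM src s JOIN tgt t ON s.id = t.id
        WHERE CAST(ROUND(s.amount * 100) AS INTEGER) <> t.amount_cents
        """
    ).fetchall()
    assert mismatches == [], f"rule violated for ids: {mismatches}"


def test_no_dropped_rows(db):
    # Every source row must arrive in the target (no silent data loss).
    missing = db.execute(
        "SELECT COUNT(*) FROM src s LEFT JOIN tgt t ON s.id = t.id "
        "WHERE t.id IS NULL"
    ).fetchone()[0]
    assert missing == 0
```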
Posted 1 month ago
6.0 years
0 Lacs
Bhopal, Madhya Pradesh, India
On-site
Experience: Minimum 6+ years
Job Title: Data Engineer – Real-Time Streaming & Integration (Apache Kafka)
Location: Bhopal, Madhya Pradesh (on-site role with opportunities to work on enterprise-scale data platforms)

Note: The resource working on site will be provided with accommodation, lunch, and dinner by the client for the complete project duration. The working week is 6 days (Monday – Saturday).

Role Overview: We are seeking a highly skilled and experienced Data Engineer with 6+ years of experience in designing and implementing real-time data processing pipelines and streaming integrations. This role is ideal for professionals with deep expertise in Apache Kafka, Kafka Connect, and modern ETL/ELT processes. As a Data Engineer, you will play a critical role in building and optimizing data integration frameworks to support large-scale, low-latency, high-throughput data platforms across enterprise systems. Your contributions will directly impact data accessibility, business intelligence, and operational efficiency.

Key Responsibilities:
- Design, develop, and maintain real-time streaming data pipelines using Apache Kafka and Kafka Connect (a brief illustrative sketch follows at the end of this posting).
- Implement and optimize ETL/ELT processes for structured and semi-structured data from various sources.
- Build and maintain scalable data ingestion, transformation, and enrichment frameworks across multiple environments.
- Collaborate with data architects, analysts, and application teams to deliver integrated data solutions that meet business requirements.
- Ensure high availability, fault tolerance, and performance tuning for streaming data infrastructure.
- Monitor, troubleshoot, and enhance Kafka clusters, connectors, and consumer applications.
- Enforce data governance, quality, and security standards throughout the pipeline lifecycle.
- Automate workflows using orchestration tools and CI/CD pipelines for deployment and version control.

Required Skills & Qualifications:
- Strong hands-on experience with Apache Kafka, Kafka Connect, and Kafka Streams.
- Expertise in designing real-time data pipelines and stream-processing architectures.
- Solid experience with ETL/ELT frameworks using tools like Apache NiFi, Talend, or custom Python/Scala-based solutions.
- Proficiency in at least one programming language: Python, Java, or Scala.
- Deep understanding of message serialization formats (e.g., Avro, Protobuf, JSON).
- Strong SQL skills and experience working with data lakes, warehouses, or relational databases.
- Familiarity with schema registry, data partitioning, and offset management in Kafka.
- Experience with Linux environments, containerization, and CI/CD best practices.

Preferred Qualifications:
- Experience with cloud-native data platforms (e.g., AWS MSK, Azure Event Hubs, GCP Pub/Sub).
- Exposure to stream-processing engines like Apache Flink or Spark Structured Streaming.
- Familiarity with data lake architectures, data mesh concepts, or real-time analytics platforms.
- Knowledge of DevOps tools like Docker, Kubernetes, Git, and Jenkins.

Work Experience: 6+ years of experience in data engineering with a focus on streaming data and real-time integrations, and a proven track record of implementing data pipelines in production-grade enterprise environments.

Education Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certifications in data engineering, Kafka, …
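As a minimal illustration of one building block of the streaming pipelines described above, the sketch below does a small produce/consume round trip with the kafka-python client (pip install kafka-python); the broker address, topic, and consumer group are assumptions.

```python
# Sketch of a Kafka produce/consume round trip with kafka-python.
# Broker address, topic name, and group id are illustrative assumptions.
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 1, "amount": 10.5})
producer.flush()  # block until the record is acknowledged

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="orders-etl",
    auto_offset_reset="earliest",  # replay from the start for new groups
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    # An enrichment/transformation step would go here before loading
    # the record downstream.
    print(message.topic, message.offset, message.value)
```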
Posted 1 month ago
5.0 - 9.0 years
6 - 9 Lacs
Hyderābād
On-site
About the Role:
Grade Level (for internal use): 10
Title: Senior Database Application Developer

Team: The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering, irrespective of hierarchy, promoting innovation. One should feel empowered to iterate over ideas and experiment without being afraid of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with exposure to the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities: We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Xpressfeed product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of Java, data pipelines, big data, machine learning, and multi-cloud. The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing.
- Actively participate in all scrum ceremonies; follow AGILE best practices effectively.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Produce technical design documents and conduct technical walkthroughs.
- Document and demonstrate solutions using technical design docs, diagrams, and stubbed code.
- Collaborate effectively with technical and non-technical stakeholders.
- Respond to and resolve production issues.

What we are looking for:
- A minimum of 5 to 9 years of significant experience in application development.
- Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
- Experience working with high-volume data and computationally intensive systems.
- Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must.
- Proficiency in the development environment, including IDE, Git, Autosys, continuous integration, unit-testing tools, and defect management tools.
- Domain knowledge in the financial industry and capital markets is a plus.
- Excellent communication skills are essential, with strong verbal and writing proficiencies.
- Mentor teams, innovate and experiment, give face to business ideas, and present to key stakeholders.

Required technical skills:
- Excellent skills developing solutions involving relational database technologies on SQL Server and/or Oracle platforms.
- Build data pipelines.
- Utilize platforms like Snowflake, Databricks, GCP Fabric, BigQuery, etc.
- Utilize cloud managed services like AWS Step Functions, AWS Lambda, and AWS DynamoDB.
- Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow (an illustrative Airflow sketch follows at the end of this posting).
- Develop federated data services to provide scalable and performant data APIs (REST, Java, Scala).
- Write infrastructure as code to develop sandbox environments.
- Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
- Feed data at scale to clients that are geographically distributed.

Desirable technical skills: React, HTML/CSS, API development, micro-services patterns, cloud technologies and managed services (preferably AWS), big data and analytics, relational databases (preferably PostgreSQL), NoSQL databases, NiFi, Airflow, Spark, Hive, Spring Cloud, Spring Cloud Data Flow, Netty, Akka, Esper, Redis, Google protobuf, Google Guice, Google Guava.

About S&P Global Market Intelligence

At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings (Strategic Workforce Planning)

Job ID: 316188
Posted On: 2025-06-18
Location: Hyderabad, Telangana, India
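The required skills above name Airflow among the orchestration options; below is a minimal sketch of a daily DAG with two dependent tasks (Airflow 2.x style). The dag_id and task bodies are placeholders for illustration, not the actual Xpressfeed pipeline.

```python
# Minimal Airflow 2.x DAG sketch: a daily pipeline with two dependent
# tasks. The dag_id and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the feed")


def load():
    print("load into the warehouse")


with DAG(
    dag_id="xpressfeed_daily_load",  # hypothetical name for illustration
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```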
Posted 1 month ago
5.0 - 9.0 years
3 - 7 Lacs
Gurgaon
On-site
This listing repeats the Senior Database Application Developer posting above (Job ID: 316188) verbatim; refer to that posting for the full description, benefits, and equal-opportunity statements.
Posted 1 month ago
5.0 - 9.0 years
6 - 9 Lacs
Ahmedabad
On-site
About the Role: Grade Level (for internal use): 10 Title: Senior Database Application Developer Team S&P Global Marketplace technology team consists of geographically diversified software engineers responsible to develop scalable solutions by working directly with product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy promoting innovation. One should feel empowered to iterate over ideas and experimentation without being afraid of failure. Impact You will enable S&P business to showcase our proprietary S&P Global data, combine it with “curated” alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients’ channel of choice to help them make better investment and business decisions, with confidence. What you can expect An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies along with appreciation in product development life cycle to convert an idea into revenue generating stream. Responsibilities We are looking for a self-motivated, enthusiastic and passionate software engineer to develop technology solutions for S&P global Xpressfeed product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting edge technologies consisting of Java, data pipelines, big data, machine learning and multi-cloud. The development is already underway so the candidate would be expected to get up to speed very quickly & start contributing. Active participation in all scrum ceremonies, follow AGILE best practices effectively. Play a key role in the development team to build high-quality, high-performance, scalable code Produce technical design documents and conduct technical walkthroughs Document and demonstrate solutions using technical design docs, diagrams and stubbed code Collaborate effectively with technical and non-technical stakeholders Respond to and resolve production issues. What we are looking for A minimum of 5 to 9 years of significant experience in application development. Proficient with software development lifecycle (SDLC) methodologies like Agile, Test-driven development. Experience working with high volume data and computationally intensive system. Garbage collection friendly programming experience - tuning java garbage collection & performance is a must. Proficiency in the development environment, including IDE, GIT, Autosys, Continuous Integration, unit-testing tool and defect management tools Domain knowledge in Financial Industry and Capital Markets is a plus. Excellent communication skills are essential, with strong verbal and writing proficiencies. Mentor teams, innovate and experiment, give face to business ideas and present to key stakeholders Required technical skills: Excellent skills developing solutions involving relational database technologies on SQL Server and/or Oracle Platforms. Build data pipelines Utilize platforms like Snowflake, Databricks, GCP Fabric, Big Query, etc. Utilize cloud managed services like AWS Step functions, AWS Lambda, AWS DynamoDB Develop custom solutions using Apache nifi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow Develop federated data services to provide scalable and performant data APIs, REST, Java, Scala Write infrastructure as code to develop sandbox environments Provide analytical capabilities using BI tools like tableau, power BI etc. 
Feed data at scale to clients that are geographically distributed. Desirable technical skills: React, HTML/CSS, API development, microservices patterns, cloud technologies and managed services (preferably AWS), big data and analytics, relational databases (preferably PostgreSQL), NoSQL databases, NiFi, Airflow, Spark, Hive, Spring Cloud, Spring Cloud Data Flow, Netty, Akka, Esper, Redis, Google Protobuf, Google Guice, Google Guava. About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country, visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ----------------------------------------------------------- Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316188 Posted On: 2025-06-18 Location: Hyderabad, Telangana, India
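As an illustration of the pipeline-orchestration skills the posting above names (Airflow, AWS Step Functions, Lambda), here is a minimal sketch of a daily extract-then-load DAG. This is a hypothetical example, not part of the role description: it assumes Airflow 2.4 or later, and the dag_id, task ids, and the two placeholder callables are invented for illustration.

```python
# Minimal Airflow DAG sketch: a daily extract -> load pipeline.
# Hypothetical example: dag_id, task ids, and the two callables are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull one day's records from a source system.
    print("extracting records for", context["ds"])


def load(**context):
    # Placeholder: write the day's records to the warehouse.
    print("loading records for", context["ds"])


with DAG(
    dag_id="feed_daily_pipeline",      # assumed name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```

The `extract_task >> load_task` dependency is the piece an orchestrator adds over a plain script: scheduling, retries, and backfills come from the framework rather than hand-rolled cron jobs.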
Posted 1 month ago
0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Data Modeller JD: We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices. Data Modelling: Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards. Create conceptual, logical, and physical data models that support the bank's strategic objectives. Ensure data models are optimized for performance, security, and scalability to support business operations and analytics. Collaboration with Data Architect: Work closely with the Data Architect to establish the overall data architecture strategy and framework. Contribute to the definition of data model structures within a data mesh environment. Data Quality and Governance: Ensure data quality and integrity in the data models by implementing best practices in data governance. Assist in the establishment of data management policies and standards. Conduct regular data audits and reviews to ensure data accuracy and consistency across systems. Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools. Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j (graph). Data Warehousing Technologies: Snowflake, Teradata, or similar. ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar. Big Data Technologies: Hadoop, Spark (optional but preferred). Cloud Technologies: Experience with data modelling on cloud platforms such as Microsoft Azure (Synapse, Data Factory).
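To make the "physical data model" part of the role concrete, here is a minimal sketch of what a simplified CASA account table might look like, expressed as DDL and exercised with Python's built-in sqlite3. The table and column names are hypothetical illustrations, not the bank's actual model.

```python
# Simplified physical model for a CASA account entity, expressed as DDL.
# Hypothetical schema: table and column names are illustrative only.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS casa_account (
    account_id   INTEGER PRIMARY KEY,
    customer_id  INTEGER NOT NULL,
    account_type TEXT    NOT NULL CHECK (account_type IN ('CURRENT', 'SAVINGS')),
    currency     TEXT    NOT NULL DEFAULT 'INR',
    balance      NUMERIC NOT NULL DEFAULT 0,
    opened_on    TEXT    NOT NULL,  -- ISO-8601 date
    status       TEXT    NOT NULL DEFAULT 'ACTIVE'
);
"""

conn = sqlite3.connect(":memory:")
conn.execute(DDL)
conn.execute(
    "INSERT INTO casa_account (customer_id, account_type, opened_on) VALUES (?, ?, ?)",
    (1001, "SAVINGS", "2025-01-15"),
)
print(conn.execute("SELECT account_id, account_type, balance FROM casa_account").fetchall())
conn.close()
```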
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Data Engineering, Spark, Scala, Python, On-premise, Cloudera, Snowflake, Kafka. Overview of the Company: Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom: it's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries. Team Overview: The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution! About the Role: Title: Senior Data Engineer. Location: Mumbai. Responsibilities: End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow. Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the evolution of the team's data pipeline framework. Data Architecture & Solutions: Contribute to data architecture design, applying expertise in data modelling, storage, and retrieval. Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices. Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights. Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth. Qualification Details: Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field. Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts. Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.). Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data. Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation. Desired Skills & Attributes: Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively. Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders). Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
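The streaming-pipeline work this posting describes (Kafka into Spark) can be sketched in a few lines. The snippet below is a minimal, hypothetical example assuming PySpark with the spark-sql-kafka connector available; the broker address, topic name, and checkpoint path are all assumptions for illustration.

```python
# Structured Streaming sketch: consume a Kafka topic and print records to the console.
# Hypothetical values: broker address, topic name, and checkpoint path are illustrative.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "events")                        # assumed topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/ckpt-events")     # assumed path
    .start()
)
query.awaitTermination()
```

In a production pipeline the console sink would be replaced by a durable sink (Parquet, Delta, a warehouse table), with the checkpoint location providing exactly-once recovery.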
Posted 1 month ago
9.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description: We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modeling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault 2.0. Responsibilities: Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF). Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling. Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer). Develop and maintain bronze → silver → gold data layers using DBT or Coalesce. Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery. Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata. Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams). Work closely with QA teams to integrate test automation and ensure data quality. Collaborate with cross-functional teams, including data scientists and business stakeholders, to align solutions with AI/ML use cases. Document architectures, pipelines, and workflows for internal stakeholders. Requirements: Essential Skills (Job): Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid). Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python. Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts. Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and Dimensional Modeling. Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF Triggers. Expertise in monitoring and logging with CloudWatch, AWS Glue Metrics, MS Teams alerts, and Azure Data Explorer (ADX). Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection. Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates. Experienced in data validation and exploratory data analysis with pandas profiling, AWS Glue Data Quality, and Great Expectations. Essential Skills (Personal): Excellent communication and interpersonal skills, with the ability to engage with teams. Strong problem-solving, decision-making, and conflict-resolution abilities. Proven ability to work independently and lead cross-functional teams. Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism. Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion. The candidate must have a strong work ethic and trustworthiness. Must be highly collaborative and team-oriented, with a commitment to excellence. Preferred Skills (Job): Proficiency in SQL and at least one programming language (e.g., Python, Scala). Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services. Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica). Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs). Experience with data modeling, data structures, and database design.
Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka). Preferred Skills (Personal): Demonstrates proactive thinking; strong interpersonal relations, expert business acumen and mentoring skills; the ability to work under stringent deadlines and demanding client conditions; and the ability to work under pressure to achieve multiple daily deadlines for client deliverables with a mature approach. Other Relevant Information: Bachelor's degree in Engineering with specialization in Computer Science, Artificial Intelligence, Information Technology, or a related field. 9+ years of experience in data engineering and data architecture. LeewayHertz is an equal opportunity employer and does not discriminate based on race, color, religion, sex, age, disability, national origin, sexual orientation, gender identity, or any other protected status. We encourage a diverse range of applicants.
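The bronze → silver → gold layering named above can be illustrated with a minimal bronze-to-silver step: validate an expected schema, coerce types, and deduplicate. The posting's own stack uses DBT/Coalesce and Great Expectations; pandas is used here only to keep the sketch self-contained, and the column set is a hypothetical contract invented for illustration.

```python
# Bronze -> silver sketch: validate an expected schema, standardize types, dedupe.
# Hypothetical contract: the column set below is illustrative only.
import pandas as pd

EXPECTED_COLUMNS = ["order_id", "amount", "order_ts"]

def to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    missing = set(EXPECTED_COLUMNS) - set(bronze.columns)
    if missing:
        raise ValueError(f"schema validation failed; missing columns: {missing}")
    silver = bronze.loc[:, EXPECTED_COLUMNS].copy()
    # Coerce timestamps; unparseable values become NaT and are dropped below.
    silver["order_ts"] = pd.to_datetime(silver["order_ts"], errors="coerce")
    silver = silver.dropna(subset=["order_id", "order_ts"])
    return silver.drop_duplicates(subset=["order_id"])

bronze = pd.DataFrame({
    "order_id": [1, 1, 2],
    "amount": [9.5, 9.5, 12.0],
    "order_ts": ["2025-01-01", "2025-01-01", "not-a-date"],
})
print(to_silver(bronze))  # one clean row per order_id, bad dates dropped
```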
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
We’re seeking a talented and passionate Trainer to join our dynamic team in making a remarkable impact on the future of technology. The ideal candidate should have a strong foundation in technology concepts and a keen interest in delivery and mentoring. The role involves delivering best-in-class training sessions, supporting curriculum development, and providing hands-on guidance to learners. Responsibilities - What You’ll Do: Training Coordination, Support & Delivery: Assist in scheduling and coordinating training sessions. Deliver classroom-based and virtual instructor-led training (ILT) sessions on various organizational products, platforms and technologies. Conduct hands-on training, workshops, and exercises to reinforce learning. Manage training attendance records and assessments. Learner Engagement: Help ensure learners have access to relevant resources. Address learner queries and create a positive learning environment. Ensure a smooth learning experience throughout the learning cycle. Track learners’ progress through specific assessments and exercises. Prepare learners for industry-standard certifications. Curriculum Development: Create structured learning paths for various experience levels. Develop course materials, decks, and guides for training. Update training content, available in various formats, based on industry trends and technological advancements, as and when applicable. Prepare learners with practical applications of product offerings’ concepts. Key Skills & Experience - What We’re Looking For: Technical Skills: Knowledge of any of the following technologies and industry advancements: familiarity with the GenAI landscape, Machine Learning (ML), or a related area; proficiency in Data Engineering, Apache NiFi, flow files, data integration and flow management, ETL, and data warehousing concepts; knowledge of Python, SQL and other relevant programming languages; strong expertise in LCNC development (UI/UX principles, Java, JavaScript frameworks); experience with APIs and microservices; fundamental understanding of web application development. Training & Mentoring Skills: Prior experience in conducting product-based or technology-based training sessions. Ability to simplify complex technical concepts for easy understanding. Must have delivery experience in both virtual and in-class training. Excellent articulation, collaboration and mentoring skills. Content Creation: Experience in content creation and editing of training videos. Qualifications & Experience: Bachelor's or Master's degree in Computer Science, Engineering or a related field. 5+ years of experience in cloud-based technologies or Artificial Intelligence (AI). Experience in training or coaching in a corporate or academic environment preferred. Must have MS PowerPoint knowledge and Camtasia or other video editing skills.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Bhubaneshwar
On-site
Position: Data Migration Engineer (NV46FCT RM 3324). Required Qualifications: 4–6 years of experience in data migration, data integration, and ETL development. Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases. Experience in Google BigQuery for data ingestion, transformation, and performance optimization. Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic. Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue. Experience working in cloud environments such as AWS, GCP, or Azure. Solid understanding of data modeling, schema design, and transformation best practices. Preferred Qualifications: Experience in BigQuery optimization, federated queries, and integration with external data sources. Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery. Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow. Familiarity with workflow orchestration tools such as Apache Airflow or DBT. Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios. Soft Skills: Strong problem-solving and analytical mindset with high attention to data quality. Excellent communication and collaboration skills to work with engineering and client teams. Ability to handle complex migrations under tight deadlines with minimal supervision. Job Category: Digital_Cloud_Web Technologies. Job Type: Full Time. Job Location: Bhubaneshwar / Noida. Experience: 4–6 years. Notice period: 0–30 days.
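The "custom ETL logic" and BigQuery ingestion described above can be sketched in a few lines of Python. This is a hypothetical example assuming the google-cloud-bigquery package (with pandas support) is installed and GCP credentials are already configured; the project, dataset, and table identifiers are invented for illustration.

```python
# ETL sketch: build a small DataFrame in Python and load it into a BigQuery table.
# Hypothetical identifiers: the project/dataset/table id is illustrative only.
# Assumes google-cloud-bigquery (with pandas support) and ambient GCP credentials.
import pandas as pd
from google.cloud import bigquery

def load_orders(df: pd.DataFrame, table_id: str) -> None:
    client = bigquery.Client()                       # uses default credentials
    job = client.load_table_from_dataframe(df, table_id)
    job.result()                                     # wait for the load to finish
    print(f"loaded {job.output_rows} rows into {table_id}")

if __name__ == "__main__":
    orders = pd.DataFrame({"order_id": [1, 2], "amount": [9.5, 12.0]})
    load_orders(orders, "my-project.analytics.orders")  # assumed table id
```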
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 324638 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN). Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure. Demonstrate proficiency in coding, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations. Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS. Develop and deliver detailed presentations to effectively communicate complex technical concepts. Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc. Adhere to Agile practices throughout the solution development process. Design, build, and deploy databases and data stores to support organizational requirements. Basic Qualifications: 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions. Experience with Informatica, Python, Databricks, and Azure data engineering. Ability to travel at least 25%. Preferred Skills: Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more. Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems. Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Data Migration Services, Azure Data Factory, and Google Dataproc. Professional written and verbal communication skills to effectively convey complex technical concepts. Undergraduate or graduate degree preferred. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
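The "streaming to data lakes" solutions this role describes ultimately land data in a partitioned lake layout. As a minimal batch-side sketch, the snippet below writes a Hive-style partitioned Parquet dataset with pandas; the local path and partition column are assumptions for illustration, and pyarrow is assumed to be installed.

```python
# Data-lake layout sketch: write a Hive-style partitioned Parquet dataset.
# Hypothetical layout: the local path and partition column are illustrative only.
# Assumes pandas with pyarrow installed.
import pandas as pd

events = pd.DataFrame({
    "event_id": [1, 2, 3],
    "country": ["IN", "IN", "US"],
    "value": [10, 20, 30],
})

# Writes files under lake/events/country=IN/... and lake/events/country=US/...
events.to_parquet("lake/events", partition_cols=["country"], index=False)

print(pd.read_parquet("lake/events").sort_values("event_id"))
```

Partitioning on a low-cardinality column like this lets downstream query engines prune whole directories instead of scanning the full dataset.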
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 324631 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nādu (IN-TN), India (IN). Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure. Demonstrate proficiency in coding skills, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations. Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS. Develop and deliver detailed presentations to effectively communicate complex technical concepts. Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc. Adhere to Agile practices throughout the solution development process. Design, build, and deploy databases and data stores to support organizational requirements. Basic Qualifications: 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. 2+ years of experience leading a team supporting data related projects to develop end-to-end technical solutions. Experience with Informatica, Python, Databricks, Azure Data Engineer Ability to travel at least 25%. Preferred Skills: Demonstrate production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more. Possess hands-on knowledge of Cloud and Distributed Data Storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems. Exhibit a strong understanding of Data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, Streamsets, NiFi, AWS Data Migration Services, Azure DataFactory, Google DataProc. Showcase professional written and verbal communication skills to effectively convey complex technical concepts. Undergraduate or Graduate degree preferred About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. 
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What makes this role special
Join a green-field enterprise solutions project that spans cloud infra, data pipelines, QA automation, BI dashboards, and business process analysis. Spend your first year rotating through four pods, discovering where you shine, then lock into the stream you love (DevOps, Data Engineering, QA, BI, or Business Analysis). Work side by side with senior architects and PMs, demo every Friday, and leave with production-grade experience most freshers wait years to gain.

Rotation roadmap (three months each)
- DevOps Starter: write Terraform variables, tweak Helm values, add a GitHub Action that auto-lints PRs.
- Data Wrangler: build a NiFi flow (CSV → S3 Parquet), add an Airflow DAG, validate schemas with Great Expectations.
- QA Automation: write PyTest cases for the WhatsApp bot, create a k6 load script, plug Allure reports into CI.
- BI / Business Analysis: design a Superset dataset & dashboard, document KPIs, shadow the PM to craft a user story and UAT sheet.

Day-to-day you will
- Pick tickets from your pod's board and push clean pull requests or dashboard changes.
- Pair with mentors, record lessons in the wiki, and improve run-books as you go.
- Demo your work (max 15 min) in our hybrid Friday huddle.

Must-have spark
- Basic coding in Python or JavaScript and Git fundamentals (clone → branch → PR).
- Comfort with SQL JOINs & GROUP BY and spreadsheets for quick analysis.
- A curious mindset, clear written English, happy to ask "why?" and own deadlines.

Bonus points
- A hobby Docker or AWS free-tier project.
- A Telegram/WhatsApp bot or hackathon win you can show.
- Contributions to open source or a college IoT demo.

What success looks like
- Ship at least twelve merged PRs/dashboards in your first quarter.
- Automate one manual chore the seniors used to dread.
- By month twelve you can independently take a user story from definition → code or spec → test → demo.

Growth path
Junior ➜ Associate II ➜ Senior (lead a pod); pay and AWS certifications climb with you.

How to apply
1. Fork github.com/company/erpnext-starter, fix any "good-first-issue", and open a PR.
2. Email your resume, PR link, and a 150-word story about the coolest thing you've built.
Short-listed candidates get a 30-min Zoom chat (no riddles) and a 24-hr mini-task aligned to your preferred first rotation. We hire attitude over pedigree: show you learn fast, document clearly, and love building, and you're in.
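For a flavour of the "Data Wrangler" rotation, here is a minimal Airflow 2.x DAG that gate-checks an incoming CSV before it is promoted to the lake. The simple pandas assertions stand in for the Great Expectations suite the roadmap mentions, and the file path, column names, and DAG id are made up for illustration.

```python
# Illustrative Airflow DAG: validate an incoming CSV once a day.
# Requires Airflow 2.4+ for the `schedule` parameter; names are hypothetical.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

EXPECTED_COLUMNS = {"order_id", "amount", "created_at"}  # hypothetical schema


def validate_csv(path: str = "/data/incoming/orders.csv") -> None:  # hypothetical path
    """Fail the task (and alert) if the file drifts from the expected shape."""
    df = pd.read_csv(path)
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"schema drift, missing columns: {missing}")
    if df["order_id"].isna().any():
        raise ValueError("null order_id values found")


with DAG(
    dag_id="orders_csv_validation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="validate_csv", python_callable=validate_csv)
```

In the rotation's stack this task would sit between the NiFi landing flow and the S3 Parquet write, so bad files fail loudly instead of polluting the lake.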
Posted 1 month ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company: Callerdesk.io
Location: Noida, Uttar Pradesh | On-site
Employment Type: Full-Time | Permanent
Experience Required: 3+ Years

Job Description
We are looking for an experienced and proactive MySQL Database Administrator to join our IT services team. The ideal candidate will have strong hands-on experience with MySQL architecture, database performance tuning, replication, and high-availability setups. If you're passionate about database optimization, ensuring uptime, and working closely with development teams, we'd love to hear from you.

Minimum Requirements
- Educational Qualification: B.Tech / B.E. / MCA / M.Sc in Computer Science or a related field
- Experience: minimum 3 years in IT/software development on web or mobile app projects, and minimum 3 years in MySQL database administration and performance tuning

Responsibilities
- Administer and maintain MySQL Server databases, including installation and configuration.
- Monitor system health and performance, ensuring high availability and security.
- Perform real-time troubleshooting, diagnostics, and resolution of database issues.
- Recommend and implement database solutions to improve efficiency.
- Automate recurring processes, maintain documentation, and track issues.
- Support developers with schema refinement, partitioning, and query tuning.
- Manage GTID replication, source-replica (master-slave) setups, and InnoDB clusters.
- Set up and maintain disaster recovery (DR) and ProxySQL for high-load management.
- Perform physical backup/restoration and point-in-time recovery.
- Optimize SQL queries, triggers, events, stored procedures, and functions.
- Work on Linux with tools such as MySQL Workbench and SQLyog, and with database pipelines (Python / Apache NiFi preferred).

Preferred Skills
- Strong understanding of MySQL internal architecture
- Hands-on experience with MySQL Enterprise Edition
- Proficiency in Linux server environments
- Experience on e-governance or large-scale IT projects is a plus

Role Details
Role Title: MySQL Database Administrator
Department: Engineering Software
Industry: IT Services & Consulting
Function: DBA / Data Warehousing
Educational Background: UG: B.Tech/B.E. in any specialization; PG: MCA / M.Sc (Science) in any specialization
Package: ₹3.5 to ₹6 Lakh per annum

Key Skills
MySQL DBA, GTID Replication, Linux Server Administration, High Availability Setup, InnoDB Cluster, ProxySQL, Query Optimization, Database Backup & Recovery, Apache NiFi, MySQL Workbench, SQLyog, Stored Procedures, Triggers, E-Governance Projects

Ready to join a high-impact team and take your MySQL expertise to the next level? Apply now, and let's build robust, scalable systems together.
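To make the replication-monitoring duties concrete, here is a hedged Python sketch that polls a replica's health with mysql-connector-python. The host, monitoring account, and lag threshold are placeholders, and it assumes MySQL 8.0.22+ naming (SHOW REPLICA STATUS; older servers use SHOW SLAVE STATUS and the corresponding Slave_* fields).

```python
# Illustrative replication health check: alert when the replica's IO/SQL
# threads stop or lag exceeds a threshold. Connection details are hypothetical.
import mysql.connector


def check_replica(host: str = "replica-1", max_lag_s: int = 30) -> None:
    conn = mysql.connector.connect(
        host=host, user="monitor", password="***"  # hypothetical monitoring account
    )
    try:
        cur = conn.cursor(dictionary=True)
        cur.execute("SHOW REPLICA STATUS")  # MySQL 8.0.22+; SHOW SLAVE STATUS on older servers
        status = cur.fetchone()
        if status is None:
            raise RuntimeError(f"{host} is not configured as a replica")
        if status["Replica_IO_Running"] != "Yes" or status["Replica_SQL_Running"] != "Yes":
            raise RuntimeError(f"{host}: replication threads stopped")
        lag = status["Seconds_Behind_Source"]
        if lag is None or lag > max_lag_s:
            raise RuntimeError(f"{host}: replication lag {lag}s exceeds {max_lag_s}s")
    finally:
        conn.close()


if __name__ == "__main__":
    check_replica()
```

Wired into cron or a scheduler, a check like this is one way to automate the "monitor system health" and GTID replication duties rather than eyeballing status output by hand.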
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Streaming Data

Mandatory skills: Spark, Scala, AWS, Hadoop (Big Data)
Experience: 5+ years

Technical skills requirements:
- Solid hands-on and solution-architecting experience in big data technologies (AWS preferred).
- Hands-on experience with AWS DynamoDB, EKS, Kafka, Kinesis, Glue, and EMR.
- Hands-on experience with a programming language such as Scala with Spark.
- Good command of and working experience with Hadoop MapReduce, HDFS, Hive, HBase, and/or NoSQL databases.
- Hands-on working experience with a data engineering/analytics platform (Hortonworks, Cloudera, MapR, AWS), AWS preferred.
- Hands-on experience with data ingestion tools: Apache NiFi, Apache Airflow, Sqoop, and Oozie.
- Hands-on working experience with data processing at scale on event-driven systems and message queues (Kafka, Flink, Spark Streaming).
- Hands-on working experience with AWS services such as EMR, Kinesis, S3, CloudFormation, Glue, API Gateway, and Lake Formation.
- Hands-on working experience with AWS Athena.
- Data warehouse exposure to Apache NiFi, Apache Airflow, and Kylo.
- Operationalization of ML models on AWS (e.g. deployment, scheduling, model monitoring).
- Feature engineering and data processing to be used for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience building data pipelines for structured/unstructured and real-time/batch data, and for synchronous/asynchronous events, using MQ, Kafka, and stream processing.
- Hands-on working experience analysing source-system data and data flows, working with structured and unstructured data.
- Must be very strong in writing SQL queries.
- Strengthen the data engineering team with big data solutions.
- Strong technical, analytical, and problem-solving skills.
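As an example of the hands-on AWS Athena experience the list calls for, here is a small boto3 sketch that runs a SQL query against a lake table and polls for completion. The region, database, table, and S3 results bucket are hypothetical placeholders.

```python
# Illustrative boto3 Athena query runner; all resource names are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region


def run_query(sql: str,
              database: str = "analytics",               # hypothetical database
              output: str = "s3://my-athena-results/") -> list:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    # Simple polling loop; production code would add a timeout and backoff.
    while True:
        state = athena.get_query_execution(
            QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]


if __name__ == "__main__":
    rows = run_query("SELECT event_type, COUNT(*) FROM events GROUP BY 1")  # hypothetical table
    print(rows)
```

Athena is serverless, so a wrapper like this (plus Glue-catalogued tables) is often all the "query the lake" layer a streaming pipeline needs.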
Posted 1 month ago