
10844 Apache Jobs - Page 35

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.5 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About the Company KPMG in India is a leading professional services firm established in August 1993. The firm offers a wide range of services, including audit, tax, and advisory, to national and international clients across various sectors. KPMG operates from offices in 14 cities, including Mumbai, Bengaluru, Chennai, and Delhi. KPMG India is known for its rapid, performance-based, industry-focused, and technology-enabled services. The firm leverages its global network to provide informed and timely business advice, helping clients mitigate risks and seize opportunities. KPMG India is committed to quality and excellence, fostering a culture of growth, innovation, and collaboration. About the job: Spark/Scala Developer Experience: 5.5 to 9 years Location: Mumbai We are seeking a skilled Spark/Scala Developer with 5.5 - 9 years of experience in Big Data engineering. The ideal candidate will have strong expertise in Scala programming, SQL, and data processing using Apache Spark within Hadoop ecosystems. Key Responsibilities: Design, develop, and implement data ingestion and processing solutions for batch and streaming workloads using Scala and Apache Spark. Optimize and debug Spark jobs for performance and reliability. Translate functional requirements and user stories into scalable technical solutions. Develop and troubleshoot complex SQL queries to extract business-critical insights. Required Skills: 2+ years of hands-on experience in Scala programming and SQL. Proven experience with Hadoop Data Lake and Big Data tools. Strong understanding of Spark job optimization and performance tuning. Ability to work collaboratively in an Agile environment. Equal Opportunity Statement KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
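For illustration, here is a minimal sketch of the kind of batch ingestion and transformation job this role describes, written with PySpark rather than Scala for brevity; the paths, column names, and partitioning choices are assumptions, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical batch job: ingest raw CSV orders, clean them, and publish a curated Parquet table.
spark = (
    SparkSession.builder
    .appName("orders-batch-ingest")      # illustrative application name
    .getOrCreate()
)

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/data/landing/orders/")        # hypothetical landing path on the data lake
)

curated = (
    raw.dropDuplicates(["order_id"])                       # assumed business key
       .withColumn("order_date", F.to_date("order_date"))  # normalize the date column
       .filter(F.col("amount") > 0)                        # drop obviously bad records
)

# Repartition before writing to keep output file sizes reasonable, a common Spark tuning step.
curated.repartition(8).write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders/")

spark.stop()
```

The same structure applies in Scala: read from the landing zone, deduplicate and normalize, then write partitioned columnar output, with partition counts and shuffle settings tuned as part of the job-optimization work the posting mentions.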

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are hiring for one of the IT product-based companies. Job Title: Senior Data Engineer Exp: 5+ years Location: Gurgaon/Pune Work Mode: Hybrid Skills: Azure and Databricks Programming Languages: Python, PowerShell; .NET/Java are a plus What you will do Participate in the design and development of highly performing and scalable large-scale Data and Analytics products Participate in requirements grooming, analysis and design discussions with fellow developers, architects and product analysts Participate in product planning by providing estimates on user stories Participate in daily standup meetings and proactively provide status on tasks Develop high-quality code according to business and technical requirements as defined in user stories Write unit tests that will improve the quality of your code Review code for defects and validate implementation details against user stories Work with quality assurance analysts who build test cases that validate your work Demo your solutions to product owners and other stakeholders Work with other Data and Analytics development teams to maintain consistency across the products by following standards and best software development practices Provide third-tier support for our product suite What you will bring 3+ years of Data Engineering and Analytics experience 2+ years of Azure and Databricks (or Apache Spark, Hadoop and Hive) working experience Knowledge and application of the following technical skills: T-SQL/PL-SQL, PySpark, Azure Data Factory, Databricks (or Apache Spark, Hadoop and Hive), and Power BI or equivalent Business Intelligence tools Understanding of dimensional modeling and Data Warehouse concepts Programming skills such as Python, PowerShell, .NET/Java are a plus Git repository experience and a thorough understanding of branching and merging strategies 2 years' experience developing in the Agile Software Development Life Cycle and Scrum methodology Strong planning and time management skills Advanced problem-solving skills and a data-driven mindset Excellent written and oral communication skills Team player who fosters an environment of shared success, is passionate about always learning and improving, self-motivated, open-minded, and creative What we would like to see Bachelor's degree in computer science or a related field Healthcare knowledge is a plus
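As context for the Azure and Databricks skills listed above, here is a minimal, hypothetical PySpark sketch of a Databricks-style ingestion step into a Delta table; the storage account, container, column names, and table name are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a `spark` session is already provided; this line makes the sketch self-contained.
spark = SparkSession.builder.appName("raw-events-ingest").getOrCreate()

# Hypothetical ADLS Gen2 landing path.
raw_path = "abfss://raw@contosodatalake.dfs.core.windows.net/events/"

events = (
    spark.read.format("json").load(raw_path)
         .withColumn("ingest_date", F.current_date())   # partition column for downstream pruning
         .dropna(subset=["event_id"])                    # assumed required key
)

# Append into a managed Delta table (Delta Lake is available on the Databricks runtime).
(
    events.write
          .format("delta")
          .mode("append")
          .partitionBy("ingest_date")
          .saveAsTable("analytics.raw_events")           # hypothetical catalog table
)
```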

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Hi, we are hiring for a Java Developer. 5+ years of experience as a Java developer with expertise in distributed systems and data processing pipelines Strong understanding of Google Cloud Dataflow, Apache Beam, Kafka, Splunk, and related technologies Proficiency in the Java programming language and familiarity with relevant frameworks such as Spring Boot or Hibernate Experience working with big data platforms and solving large-scale data processing challenges Looking for immediate joiners.
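A small illustrative sketch of the kind of Dataflow/Beam pipeline this role works on, using the Beam Python SDK for brevity even though the posting targets Java; the input glob, field names, and output prefix are assumptions.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical Beam pipeline: parse JSON log lines, keep error events, and count them per service.
# It runs on the local DirectRunner by default; the same code can target Google Cloud Dataflow
# by passing --runner=DataflowRunner plus project/region/temp_location options.
def parse(line: str) -> dict:
    return json.loads(line)

def run() -> None:
    options = PipelineOptions()  # command-line flags (runner, project, etc.) flow in here
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadLogs" >> beam.io.ReadFromText("logs/*.json")          # hypothetical input glob
            | "Parse" >> beam.Map(parse)
            | "ErrorsOnly" >> beam.Filter(lambda e: e.get("level") == "ERROR")
            | "KeyByService" >> beam.Map(lambda e: (e.get("service", "unknown"), 1))
            | "CountPerService" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda svc, n: f"{svc},{n}")
            | "Write" >> beam.io.WriteToText("output/error_counts")      # hypothetical output prefix
        )

if __name__ == "__main__":
    run()
```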

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Company Candidate will be responsible for designing and developing applications for automating business processes and language-understanding systems using NLP and text representation techniques. About the Role The candidate will be required to conceive, design and develop NLP applications, including document automation applications using Python and PyTorch, and should be familiar with advanced Python libraries and other programming languages. Responsibilities Develop and deploy applications such as document automation, summarization, user interface creation, and query-based chatbots. Understand business objectives and develop AI/ML models that help to achieve them, along with metrics to track their progress. Qualifications Bachelor’s/master’s degree in computer science or engineering with a focus on language processing. At least 7 years of experience with exposure to NLP and relevant projects. Required Skills Experience with AI/ML platforms, frameworks, and libraries. Knowledge of relevant programming languages, development tools and databases. Proficiency in programming in Python, PyTorch and TensorFlow. Understanding of NLP techniques for text representation, semantic extraction techniques, data structures, and modeling. Capable of writing and building components to integrate into new or existing systems. Documentation experience for complex software components. Experience in implementing the product lifecycle - design, development, quality, deployment, maintenance. Ready to work within a collaborative environment with teams. Creative thinking for identifying new opportunities. Preferred Skills Experience in projects involving natural language data using tools such as NLTK (Python), Apache OpenNLP or GATE. Knowledge of advanced desktop and web interface development, chatbot support interfaces, etc.
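To make the PyTorch requirement concrete, here is a minimal illustrative text classifier of the sort used in document automation; the vocabulary size, embedding dimension, and the two document classes are assumptions, not part of the posting.

```python
import torch
from torch import nn

# Minimal illustrative document classifier: bag-of-embeddings plus a linear head.
class DocClassifier(nn.Module):
    def __init__(self, vocab_size: int = 10_000, embed_dim: int = 64, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
        return self.head(self.embedding(token_ids, offsets))

model = DocClassifier()

# Two toy "documents" packed into one flat tensor; offsets mark where each document starts.
token_ids = torch.tensor([1, 5, 20, 3, 7, 7, 42], dtype=torch.long)
offsets = torch.tensor([0, 3], dtype=torch.long)   # doc 1 = first 3 tokens, doc 2 = the rest

logits = model(token_ids, offsets)                 # shape: (2, num_classes)
predictions = logits.argmax(dim=1)
print(predictions)
```

A production system would add a real tokenizer, training loop, and evaluation metrics, but the model structure above is the core building block.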

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. OVERALL PURPOSE OF THE ROLE: Reporting to the IS&T Head of Engineering Operations, you will ensure compliance of the solutions deployed and overall Core Model operations in production. You will be responsible for supporting a portfolio of applications in Engineering solutions, working with the partner ecosystem, business stakeholders and IS&T stakeholders. This role is part of the IS&T organization. RESPONSIBILITIES: Lead and manage run operations for a portfolio of applications: Periodic weekly / monthly review of the state of operations with key business stakeholders Provide expertise to support sites on new topics / new process deployments Perform REX (lessons learned) to ensure that the Core Model is applied in deployed sites Ensure operations of all applications are as per SLAs Participate in and lead any P1/P2 incidents pertaining to applications in production Working with the operations team and suppliers, implement actions to continuously improve the performance of applications Guide, support and mentor the ServiceNow team, providing new opportunities to grow, get exposed to stakeholders and learn continuously Lead Transition to Run. Responsible for cost management of solutions, reporting and continuous improvement. Coordinate with development, infrastructure, and business teams to manage application deployments, upgrades, and patches. Ensure compliance with security, audit, and regulatory requirements. Business connect and Stakeholder Management: Accountable, along with Business & Process Owners, to collect, filter and prioritize change requests Work with Business & Process Owners and Key Users (KU) in the submission of their demands through the Scope Control process / Ticket Management process Provide inputs to the business in establishing business requirements, functional and non-functional specifications, and advise on the appropriate solutions. Provide recommendations for decision-making at the Functional Review Board and Change Control Board. Represent as service owner and decision approver Perform KU connect / business connect meetings. Accountable for customer satisfaction Strengthen the prioritization mechanism of requirements and related governance as part of the release process Maintain strong and close relationships with key stakeholders including service owners, application owners, architects and Directors Managing the partner ecosystem for effective service delivery Manage and drive partners for effective service delivery and operations for their respective application portfolio (Incident Management, Problem Management, Service Request fulfilment and Change Management) Monitor, track & coordinate with AMS Partners on performance and availability of the applications. Monitor and track SLAs/KPIs, ensuring they are met by partners. Control quality of deliverables from external partners (run daily / weekly governance meetings) Responsible for Core Model consistency: Ensure that the Core Model (solution + processes & rules) is well documented, evolves consistently and is not jeopardized by localizations.
Ensure that design and run documents are updated frequently (coordinate with the required stakeholders to get inputs) Promote the use of a unique Core Model, cross-businesses, across the stakeholder ecosystem. Participate and provide inputs to the rest of the IS (Information Systems) teams (Architects / Projects) to maintain consistency for the respective functional domain(s): Provide support on new topics from a RUN perspective (new processes, new business stakes) Participate in / support new projects concerning the IS landscape of the functional domain Support the execution of the related strategy, in particular implementing the operational roadmap and the right initiatives Support projects and initiatives; communicate regularly about strategy, priorities and ongoing projects to external and internal partners, and ensure constant awareness and alignment Competencies & Skills Strong stakeholder management skills to connect and engage with the management team, key configuration management stakeholders, internal and external partners and suppliers Excellent verbal and written communication capabilities with the ability to interact and influence at all levels of the organization Able to formalize and present a synthetic view of complex issues and concepts Strong analytical, problem-solving and critical thinking skills, and the ability to find solutions (technical and functional) Coordination skills to lead and deliver the run roadmap and improvement projects in parallel Strong organization skills with the ability to meet tight deadlines and high challenges Results-oriented with attention to detail Ability to work with full autonomy and limited support Ability to work effectively in virtual, geographically dispersed and cross-cultural environments Maintain constant awareness and sponsorship of the IT leadership team through appropriate reporting about configuration and data management (priorities, blocking points requiring arbitration, strategy adaptations) Support the delivery of the Transformation roadmap, challenge it and bring adaptations if required. TECHNICAL COMPETENCIES & EXPERIENCE 6+ years of overall IT experience. Around 3+ years of 24x7 production support experience. Technical skills: Knowledge and hands-on experience in Power Apps, RPA solutions, Java, Apache, PL/SQL, Tomcat, ETL tools and reporting. Experience in managing and supporting Design and Industrial applications, particularly Dassault Systèmes tools such as CATIA, ENOVIA, DELMIA, the 3DEXPERIENCE Platform and other PLM/CAD/CAM/CAE tools, is an added advantage. Experience with monitoring tools (e.g., Dynatrace), ticketing systems (e.g., ServiceNow, Jira), and cloud platforms (e.g., AWS, Azure). Driving performance and service quality for run activities for Engineering Applications Partner ecosystem management: manage partner teams for run activities in compliance with Quality & Service Agreements Transition Management: secure handover between build and run activities Expertise in driving process improvement initiatives ITIL certification - good to have. Experience with other multinational companies and working in other geographies preferred. Location for the role: Bangalore. Travel: very limited. As a global business, we’re an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We’re committed to creating an inclusive workplace for everyone.

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Opentext - The Information Company OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation. AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do—powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us. Your Impact An OpenText Content Server Consultant is responsible for the technical delivery of the xECM based solutions. Such delivery activities encompass development, testing, deployment and documentation of specific software components – either providing extensions to specific items of core product functionality or implementing specific system integration components. This role has a heavy deployment and administration emphasis. Engagements are usually long term, but some relatively short ones requiring only specific services like an upgrade or a migration also happen. The nature of work may include full application lifecycle activities right from development, deployment/provisioning, testing, migration, decommissioning and ongoing run & maintain (upgrades, patching etc.) support. The role is customer facing and requires excellent interpersonal skills with the ability to communicate to a wide range of stake holders (internally and externally), both verbally and in writing. What The Role Offers Work within an OpenText technical delivery team in order to: Participate and contribute to deployment activities. Participate in the day to day administration of the systems, including Incident & Problem Management Participate in planning and execution of new implementations, upgrades and patching activities. Participate in the advanced configuration of ECM software components, in line with project and customer time scales. Actively contribute in automating provisioning, patching and upgrade activities where possible to achieve operational efficiencies. Perform code reviews and periodic quality checks to ensure delivery quality is maintained. Prepare, maintain and submit activity/progress reports and time recording/management reports in accordance with published procedures. Keep project managers informed of activities and alert of any issues promptly. Provide inputs as part of engagement closure on project learnings and suggest improvements. Utilize exceptional written and verbal communication skills while supporting customers via web, telephone, or email, while demonstrating a high level of customer focus and empathy. Respond to and solve customer technical requests, show an understanding of the customer's managed hosted environment and applications within the Open Text enabling resolution of complex technical issues. Document or Implement proposed solutions. Respond to and troubleshoot alerts from monitoring of applications, servers and devices sufficient to meet service level agreements Collaborating on cross-team and cross-product technical issues with a variety of resources including Product support, IT, and Professional Services. 
What You Need To Succeed Well versed in deployment, administration and troubleshooting of the OpenText xECM platform and surrounding components (Content Server, Archive Center, Brava, OTDS, Search & Indexing) and integrations with SAP, SuccessFactors and Salesforce. Good experience/knowledge of the following: Experience working in an ITIL-aligned service delivery organisation. Knowledge of Windows, UNIX, and application administration skills in a TCP/IP networked environment. Experience working with relational DBMS (PostgreSQL/Postgres, Oracle, MS SQL Server, MySQL). Ability to independently construct moderately complex SQL queries without guidance. Programming/scripting is highly desirable (e.g. OScript, Java, JavaScript, PowerShell, Bash, etc.). Familiarity with configuration and management of web/application servers (IIS, Apache, Tomcat, JBoss, etc.). Good understanding of object-oriented programming, Web Services and LDAP configuration. Experience in installing and configuring xECM in HA and knowledge of DR setup/drills. Experience in patching, major upgrades and data migration activities. Candidate Should Possess Team player Customer Focus and Alertness Attention to detail Always learning Critical Thinking Highly motivated Good Written and Oral Communication Knowledge sharing, blogs OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please contact us at hr@opentext.com. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.

Posted 1 week ago

Apply

7.0 years

0 Lacs

India

On-site

Required Skills & Experience: Minimum 7+ years of professional experience in ColdFusion development (CFML). Proven expertise with ColdFusion versions 10, 11, 2016, 2018, or newer. Strong understanding of MVC frameworks (e.g., FW/1, ColdBox) and object-oriented programming (OOP) principles in ColdFusion. Proficiency in SQL and experience working with relational databases (e.g., MS SQL Server, MySQL). Ability to write complex queries, stored procedures, and optimize database performance. Experience with front-end technologies: HTML, CSS, JavaScript, jQuery. Familiarity with web servers (IIS, Apache) and their integration with ColdFusion. Experience with version control systems (e.g., Git). Strong analytical and problem-solving skills with an ability to diagnose and resolve complex technical issues quickly and effectively. Excellent communication skills, both written and verbal, with the ability to articulate technical concepts clearly to non-technical stakeholders. Ability to work independently with minimal supervision and as part of a distributed team. Proactive attitude with a strong sense of ownership and responsibility for application stability. Job Description We are seeking a highly skilled and experienced ColdFusion Developer to join our team on a contract basis. The successful candidate will be instrumental in enhancing, maintaining, and troubleshooting our customer’s critical ColdFusion applications, ensuring their stability and performance during our strategic migration. This role requires a strong understanding of ColdFusion best practices, excellent problem-solving skills, and the ability to work collaboratively with the customer’s internal development and operations teams. Key Responsibilities & Duties: Enhancements & Refinements: Implement enhancements and feature requests for ColdFusion applications as required. Optimize existing ColdFusion code for efficiency, scalability, and security. Participate in code reviews to ensure quality and adherence to established standards. Application Maintenance & Support: Perform regular maintenance, bug fixes, and performance tuning on existing ColdFusion applications. Monitor application health, identify potential issues, and implement proactive solutions to prevent downtime. Respond to and resolve production incidents and user-reported issues in a timely and efficient manner. Collaborate with internal support teams to diagnose and resolve complex technical problems. Documentation: Create and update technical documentation for ColdFusion applications, including system architecture, configurations, and troubleshooting guides. Document solutions to recurring issues and best practices for future reference. Collaboration & Communication: Work closely with existing development teams (including PHP developers) to understand application dependencies and ensure smooth operations during the transition phase. Communicate effectively with project managers, stakeholders, and other team members regarding progress, challenges, and solutions. Provide technical guidance and knowledge transfer to internal teams as needed, particularly regarding the intricacies of the ColdFusion codebase. Database Interaction: Develop and optimize complex SQL queries for various database systems (e.g., MS SQL Server, MySQL) used by ColdFusion applications. Ensure data integrity and performance of database interactions. Security: Adhere to security best practices and implement necessary measures to protect sensitive data within ColdFusion applications.
Other nice to have skills: Experience with PHP or other modern web technologies (Node.js, Python, Java) would be a plus, demonstrating an understanding of different development paradigms. Familiarity with AWS or Azure cloud environments. Experience in the FinTech or Healthcare sector. Knowledge of Direct Debit systems or recurring payment platforms.

Posted 1 week ago

Apply

4.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Mindera At Mindera, we craft software with people we love. We're a collaborative, global team of engineers who value open communication, great code, and building impactful products. We're looking for a talented C#/.NET Developer to join our growing team in Gurugram and help us build scalable, high-quality software systems. Requirements What You'll Do Build, maintain, and scale robust C#/.NET applications in a fast-paced Agile environment. Work closely with product owners and designers to bring features to life. Write clean, maintainable code following SOLID and OOP principles. Work with SQL/NoSQL databases, optimizing queries and schema designs. Collaborate in a Scrum or Kanban environment with engineers around the world. Use Git for version control and participate in code reviews. Contribute to our CI/CD pipelines and automated testing workflows. Must-Have Skills What We're Looking For 4-9 years of hands-on experience with C# and .NET technologies. Solid understanding of Object-Oriented Programming (OOP) and clean code principles. Proven experience working with databases (SQL or NoSQL). Experience in an Agile team (Scrum/Kanban). Familiarity with Git and collaborative development practices. Exposure to CI/CD pipelines and test automation. Nice-to-Have Skills Experience with Rust (even hobbyist experience is valued). Background working with Python or Scala for Spark-based applications. Hands-on with Docker and container-based architecture. Familiarity with Kubernetes for orchestration. Experience working with Apache Airflow for data workflows. Cloud experience with Google Cloud Platform (GCP) or Microsoft Azure. Benefits We Offer Flexible working hours (self-managed) Competitive salary Annual bonus, subject to company performance Access to Udemy online training and opportunities to learn and grow within the role About Mindera At Mindera we use technology to build products we are proud of, with people we love. Software Engineering Applications, including Web and Mobile, are at the core of what we do at Mindera. We partner with our clients to understand their product and deliver high-performance, resilient and scalable software systems that create an impact for their users and businesses across the world. You get to work with a bunch of great people, where the whole team owns the project together. Our culture reflects our lean and self-organisation attitude. We encourage our colleagues to take risks, make decisions, work in a collaborative way and talk to everyone to enhance communication. We are proud of our work and we love to learn all and everything while navigating through an Agile, Lean and collaborative environment. Follow our LinkedIn page: https://tinyurl.com/minderaindia Check out our blog: http://mindera.com/ and our Handbook: http://bit.ly/MinderaHandbook Our offices are located in: Aveiro, Portugal | Porto, Portugal | Leicester, UK | San Diego, USA | San Francisco, USA | Chennai, India | Bengaluru, India

Posted 1 week ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Lead and mentor a team of Linux system administrators, assigning tasks and monitoring performance. Design, deploy, maintain, and optimize Linux-based infrastructure (RedHat, CentOS, Oracle Linux, Ubuntu). Manage critical services such as Apache, Nginx, MySQL/MariaDB, etc. Configure and maintain monitoring tools (e.g., Nagios, Zabbix, Prometheus, Grafana). Implement and enforce security practices: patching, hardening, firewalls (iptables/nftables), SELinux. Oversee backup and disaster recovery processes. Plan and execute migrations, upgrades, and performance tuning. Collaborate with cross-functional teams (Network, DevOps, Development) to support infrastructure needs. Define and document policies, procedures, and best practices. Respond to incidents and lead root cause analysis for system outages or degradations. Maintain uptime and SLAs for production environments. Experience with virtualization (KVM, VMware, Proxmox) and cloud platforms (AWS, GCP, Azure, or private cloud). Solid understanding of TCP/IP, DNS, DHCP, VPN, and other network services. Hands-on experience working with firewalls such as Sophos and FortiGate. Strong problem-solving and incident management skills. Skills: cloud, azure, selinux, firewall management, proxmox, apache, cloud platforms, oracle linux, dns administration, nginx, linux administration, infrastructure, hardening, incident management, aws, backup and disaster recovery, security practices, dhcp, mariadb, virtualization, grafana, disaster recovery, ubuntu, centos, patching, vmware, dns, kvm, linux, redhat, gcp, nagios, prometheus, mysql, problem solving, firewalls, zabbix, vpn, tcp/ip
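As an illustration of the monitoring and incident-analysis work described, here is a small hypothetical Python helper that scans an Apache access log for an elevated 5xx error rate; the log path, log format, and alert threshold are assumptions.

```python
import re
from collections import Counter
from pathlib import Path

# Illustrative ops helper: count HTTP status codes in an Apache access log and flag 5xx spikes.
LOG_PATH = Path("/var/log/apache2/access.log")   # assumed log location (combined/common log format)
STATUS_RE = re.compile(r'"\s(\d{3})\s')           # status code follows the quoted request line
ALERT_THRESHOLD = 0.05                            # alert if more than 5% of requests are 5xx

def check_error_rate(log_path: Path) -> None:
    counts = Counter()
    with log_path.open(encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = STATUS_RE.search(line)
            if match:
                counts[match.group(1)] += 1

    total = sum(counts.values())
    errors = sum(n for code, n in counts.items() if code.startswith("5"))
    if total and errors / total > ALERT_THRESHOLD:
        print(f"ALERT: {errors}/{total} requests returned 5xx")
    else:
        print(f"OK: {errors}/{total} requests returned 5xx")

if __name__ == "__main__":
    check_error_rate(LOG_PATH)
```

In practice this kind of check would usually live inside Nagios, Zabbix, or Prometheus exporters rather than a standalone script, but the parsing logic is the same.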

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Data Engineer – GCP Location: On-Site (Hyderabad / Bangalore) Experience: 5+ Years Employment Type: Full-Time Job Overview: We are seeking a highly skilled and motivated Data Engineer with a strong background in Google Cloud Platform (GCP) and data processing frameworks. The ideal candidate will have hands-on experience in building and optimizing data pipelines, architectures, and data sets using GCP services like BigQuery, Dataflow, Pub/Sub, GCS, and Cloud Composer. Key Responsibilities: Design, build, and maintain scalable and efficient data pipelines on GCP. Implement data ingestion, transformation, and orchestration using GCP services: BigQuery, Dataflow, Pub/Sub, GCS, and Cloud Composer. Write complex and optimized SQL queries for data transformation and analytics. Develop Python scripts for custom transformations and pipeline logic. Orchestrate workflows using Apache Airflow (Cloud Composer). Collaborate with DevOps to implement CI/CD pipelines using Jenkins, GitLab, and Terraform. Ensure data quality, governance, and reconciliation across systems. Required Skills: GCP Expertise: BigQuery, Dataflow, Pub/Sub, GCS, Cloud Composer Languages: Advanced SQL, Python (strong scripting and data transformation experience) DevOps & IaC: Basic experience with Terraform, Jenkins, and GitLab Data Orchestration: Strong experience with Apache Airflow Nice-to-Have Skills: Containerization & Cluster Management: GKE (Google Kubernetes Engine) Big Data Ecosystem: Bigtable, Kafka, Hadoop CDC/Data Sync: Oracle GoldenGate Distributed Processing: PySpark Data Auditing: Data reconciliation frameworks or strategies
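A minimal illustrative Cloud Composer (Airflow) DAG of the kind this role builds: a Python validation step followed by a BigQuery transformation. The project, dataset, table names, and schedule are assumptions, and the parameters assume a recent Airflow 2.x environment with the Google provider installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

def validate_inputs() -> None:
    # Placeholder for the custom Python transformation/validation logic the posting mentions.
    print("inputs validated")

with DAG(
    dag_id="daily_sales_pipeline",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    validate = PythonOperator(task_id="validate_inputs", python_callable=validate_inputs)

    load_curated = BigQueryInsertJobOperator(
        task_id="load_curated_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.curated.daily_sales` AS
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.raw.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )

    validate >> load_curated
```

On Cloud Composer, a file like this is simply dropped into the environment's DAGs bucket; the scheduler picks it up and runs it on the declared cadence.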

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role Grade Level (for internal use): 10 The Team: Team is responsible for the development of tools to collect data from various sources. It is the backbone of all the data being presented to clients. The team is responsible for modernizing and migrating the internal platform utilizing the latest technologies. The Impact: As a Software Developer III, you will be part of the development team that manages multi-terabyte data using the latest web, cloud and big data technologies. You will be part of a heavily data-intensive environment. What’s in it for you: It’s a fast-paced agile environment that deals with huge volumes of data, so you’ll have an opportunity to sharpen your data skills and work on an emerging technology stack. Responsibilities Design and implement software components for data processing systems. Perform analysis and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end-users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits. Basic Qualifications What We’re Looking For: B.S. in Computer Science or equivalent 4+ years of relevant experience Expert in OOP, .NET and C# concepts Expert in server-side programming using ASP.NET or Python Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP) Experience with Big Data platforms such as Apache Spark Proficient with software development lifecycle (SDLC) methodologies like Scaled Agile and Test-driven development Good experience developing solutions involving relational database technologies on the SQL Server platform, with stored procedure programming experience using Transact-SQL Passionate, smart, and articulate developer Able to work well individually and with a team Strong problem-solving skills Good work ethic, self-starter, and results-oriented Preferred Qualifications Experience working in cloud computing environments such as AWS Experience with large-scale messaging systems What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 318313 Posted On: 2025-07-28 Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Scope: Systems Engineer responsibilities include deploying product updates, identifying production issues and implementing integrations that meet customer needs. The position works closely with internal teams and is responsible for configuration and troubleshooting of Karix products. Candidates should be familiar with Ruby or Python. Ultimately, you will execute and automate operational processes quickly, accurately, and securely. Job Location: Hyderabad What You'd Do? Implement integrations requested by internal teams / customers Provide Level 2 technical support for queries coming from internal teams Build tools to reduce occurrences of errors and improve customer experience Perform root cause analysis for production errors Investigate and resolve technical issues; deploy updates and fixes Develop scripts to automate visualization Design procedures for system troubleshooting and maintenance 24x7 support; should be flexible with shifts What You'd Have? Basic knowledge of ITIL Strong hands-on Linux administration & troubleshooting, scripting, VMware, cron scheduling for backups, LVM; basics of Java, C++, Ansible, Nginx, Apache, Tomcat Networking fundamentals (TCP/IP, LAN, WAN, VPN, routing) Basic knowledge of UAT-DC-DR environments, VAPT, SCD, SCR, SAN storage Basic knowledge of MySQL, Oracle, MongoDB, Redis, backup & restore, Commvault Any other technical exposure to different technologies will be an advantage Installation of packages, patch management and upgrades Basic understanding of languages (shell scripting, Python, Perl) Knowledge of best practices and IT operations in an always-up, always-available service.
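A small hypothetical example of the operational automation this role calls for: checking that key HTTP services respond and restarting them via systemd if they do not. The service names, health endpoints, and permissions model are assumptions for the sketch.

```python
import subprocess
import urllib.request

# Illustrative ops automation: verify an HTTP health endpoint and restart its service if the check fails.
SERVICES = [
    {"name": "nginx", "url": "http://127.0.0.1/healthz"},       # hypothetical endpoints
    {"name": "tomcat", "url": "http://127.0.0.1:8080/healthz"},
]

def is_healthy(url: str, timeout: float = 3.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def restart(service: str) -> None:
    # Requires appropriate sudo/systemd permissions on the host.
    subprocess.run(["systemctl", "restart", service], check=True)

if __name__ == "__main__":
    for svc in SERVICES:
        if is_healthy(svc["url"]):
            print(f"{svc['name']}: OK")
        else:
            print(f"{svc['name']}: unhealthy, restarting")
            restart(svc["name"])
```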

Posted 1 week ago

Apply

3.0 years

15 - 20 Lacs

Madurai, Tamil Nadu

On-site

Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or email: kanthasanmugam.m@techmango.net Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is delivering strategic solutions towards the goals of its business partners in terms of technology. We are a full-scale, leading Software and Mobile App Development Company. Techmango is driven by the mantra “Clients’ Vision is our Mission”, and we stay true to that statement. We aim to be the technologically advanced and most loved organization, providing prime-quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA - Chicago, Atlanta; Dubai - UAE; and in India - Bangalore, Chennai, Madurai, Trichy. Techmango: https://www.techmango.net/ Job Title: GCP Data Engineer Location: Madurai Experience: 5+ Years Notice Period: Immediate About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project. Role Summary As a GCP Data Engineer, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals. Key Responsibilities: Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP) Define data strategy, standards, and best practices for cloud data engineering and analytics Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery) Architect data lakes, warehouses, and real-time data platforms Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP) Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards Provide technical leadership in architectural decisions and future-proofing the data ecosystem Required Skills & Qualifications: 5+ years of experience in data architecture, data engineering, or enterprise data platforms. Minimum 3 years of hands-on experience with GCP Data Services. Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner. Python / Java / SQL. Data modeling (OLTP, OLAP, star/snowflake schema). Experience with real-time data processing, streaming architectures, and batch ETL pipelines. Good understanding of IAM, networking, security models, and cost optimization on GCP. Prior experience in leading cloud data transformation projects. Excellent communication and stakeholder management skills. Preferred Qualifications: GCP Professional Data Engineer / Architect Certification. Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics.
Exposure to AI/ML use cases and MLOps on GCP. Experience working in agile environments and client-facing roles. What We Offer: Opportunity to work on large-scale data modernization projects with global clients. A fast-growing company with a strong tech and people culture. Competitive salary, benefits, and flexibility. Collaborative environment that values innovation and leadership. Job Type: Full-time Pay: ₹1,500,000.00 - ₹2,000,000.00 per year Application Question(s): Current CTC ? Expected CTC ? Notice Period ? (If you are serving Notice period please mention the Last working day) Experience: GCP Data Architecture : 3 years (Required) BigQuery: 3 years (Required) Cloud Composer (Airflow): 3 years (Required) Location: Madurai, Tamil Nadu (Required) Work Location: In person
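For illustration, here is a minimal sketch of one building block of such pipelines: loading newline-delimited JSON from Cloud Storage into BigQuery and running a reconciliation query with the google-cloud-bigquery client. The project, bucket, dataset, and table names are assumptions.

```python
from google.cloud import bigquery

# Illustrative ingestion step: load newline-delimited JSON from GCS into BigQuery, then run a
# reconciliation query over the result.
client = bigquery.Client(project="my-gcp-project")   # hypothetical project

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/orders/2025-07-28/*.json",   # hypothetical landing bucket
    "my-gcp-project.raw.orders",                         # hypothetical destination table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # block until the load finishes

query = """
    SELECT order_date, COUNT(*) AS row_count
    FROM `my-gcp-project.raw.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.order_date, row.row_count)
```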

Posted 1 week ago

Apply

9.0 - 15.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title- Snowflake Data Architect Experience- 9 to 15 Years Location- Gurugram Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making. Key Responsibilities: Design, develop, and implement scalable Snowflake-based data architectures. Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts. Optimize Snowflake performance through clustering, partitioning, and caching strategies. Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions. Ensure data quality, governance, integrity, and security across all platforms. Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake. Automate data workflows and support CI/CD deployment practices. Implement data modeling techniques including dimensional modeling, star/snowflake schema, normalization/denormalization. Support and promote metadata management and data governance best practices. Technical Skills (Hard Skills): Expertise in Snowflake: Architecture design, performance tuning, cost optimization. Strong proficiency in SQL, Python, and scripting for data engineering tasks. Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar. Proficient in data modeling (dimensional, relational, star/snowflake schema). Good knowledge of Cloud Platforms: AWS, Azure, or GCP. Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks. Experience with CI/CD tools and version control systems (e.g., Git). Knowledge of BI tools such as Tableau, Power BI, or Looker. Certifications (Preferred/Required): ✅ Snowflake SnowPro Core Certification – Required or Highly Preferred ✅ SnowPro Advanced Architect Certification – Preferred ✅ Cloud Certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – Preferred ✅ ETL Tool Certifications (e.g., Talend, Matillion) – Optional but a plus Soft Skills: Strong analytical and problem-solving capabilities. Excellent communication and collaboration skills. Ability to translate technical concepts into business-friendly language. Proactive, detail-oriented, and highly organized. Capable of multitasking in a fast-paced, dynamic environment. Passionate about continuous learning and adopting new technologies. Why Join Us? Work on cutting-edge data platforms and cloud technologies Collaborate with industry leaders in analytics and digital transformation Be part of a data-first organization focused on innovation and impact Enjoy a flexible, inclusive, and collaborative work culture
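A minimal illustrative ELT step against Snowflake using the snowflake-connector-python package, staging data with COPY INTO and then upserting with MERGE; the account identifier, credentials, warehouse, stage, and table names are assumptions, not part of the posting.

```python
import snowflake.connector

# Illustrative ELT step: bulk-load staged files into a landing table, then merge into a curated table.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",    # hypothetical account identifier
    user="ETL_USER",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # 1. Bulk-load staged files into a landing table.
    cur.execute("""
        COPY INTO STAGING.ORDERS_LANDING
        FROM @STAGING.ORDERS_STAGE
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
    # 2. Merge the landing rows into the curated table (upsert on the business key).
    cur.execute("""
        MERGE INTO CURATED.ORDERS t
        USING STAGING.ORDERS_LANDING s
        ON t.ORDER_ID = s.ORDER_ID
        WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT, t.UPDATED_AT = CURRENT_TIMESTAMP()
        WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT, UPDATED_AT)
        VALUES (s.ORDER_ID, s.AMOUNT, CURRENT_TIMESTAMP())
    """)
finally:
    conn.close()
```

The same pattern is what ETL tools such as Matillion or Talend generate under the hood; clustering keys and warehouse sizing would then be tuned on the target tables as part of the performance work the role describes.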

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title : Cloud Data Engineer | Database Administrator | ETL & Power BI | DevOps Enthusiast Job Location : Hyderabad /Chennai Job Type : Full Time Experience : 6+ Yrs Notice Period - Immediate to 15 days joiners are highly preferred About the Role: We are seeking a Cloud Data Engineer & Database Administrator to join our Cloud Engineering team and support our cloud-based data infrastructure. This role focuses on optimizing database operations, enabling analytics/reporting tools, and driving automation initiatives to improve scalability, reliability, and cost efficiency across the data platform. Key Responsibilities: Manage and administer cloud-native databases, including Azure SQL, PostgreSQL Flexible Server, Cosmos DB (vCore), and MongoDB Atlas . Automate database maintenance tasks (e.g., backups, performance tuning, auditing, and cost optimization). Implement and monitor data archival and retention policies to enhance query performance and reduce costs. Build and maintain Jenkins pipelines and Azure Automation jobs for database and data platform operations. Design, develop, and maintain dashboards for cost tracking, performance monitoring, and usage analytics (Power BI/Tableau). Enable and manage authentication and access controls (Azure AD, MFA, RBAC). Collaborate with cross-functional teams to support workflows in Databricks, Power BI, and other data tools . Write and maintain technical documentation and standard operating procedures (SOPs) for data platform operations. Work with internal and external teams to ensure alignment of deliverables and data platform standards. Preferred Qualifications: Proven experience with cloud platforms (Azure preferred; AWS or GCP acceptable). Strong hands-on expertise with relational and NoSQL databases . Experience with Power BI (DAX, data modeling, performance tuning, and troubleshooting). Familiarity with CI/CD tools (Jenkins, Azure Automation) and version control (Git). Strong scripting knowledge ( Python, Bash, PowerShell ) and experience with Jira, Confluence, and ServiceNow . Understanding of cloud cost optimization and billing/usage tracking. Experience implementing RBAC, encryption, and security best practices . Excellent problem-solving skills, communication, and cross-team collaboration abilities. Nice to Have: Hands-on experience with Databricks, Apache Spark, or Lakehouse architecture . Familiarity with logging, monitoring, and incident response for data platforms. Understanding of Kubernetes, Docker, Terraform , and advanced CI/CD pipelines. Required Skills: Bachelor’s degree in computer science, Information Technology, or a related field (or equivalent professional experience). 6+ years of professional experience in data engineering or database administration. 3+ years of database administration experience in Linux and cloud/enterprise environments. About the Company: Everest DX – We are a Digital Platform Services company, headquartered in Stamford. Our Platform/Solution includes Orchestration, Intelligent operations with BOTs’, AI-powered analytics for Enterprise IT. Our vision is to enable Digital Transformation for enterprises to deliver seamless customer experience, business efficiency and actionable insights through an integrated set of futuristic digital technologies. 
Digital Transformation Services - Specialized in Design, Build, Develop, Integrate, and Manage cloud solutions and modernize Data centers, build a Cloud-native application and migrate existing applications into secure, multi-cloud environments to support digital transformation. Our Digital Platform Services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation. Digital Platform - Cloud Intelligent Management (CiM) - An Autonomous Hybrid Cloud Management Platform that works across multi-cloud environments. helps enterprise Digital Transformation get most out of the cloud strategy while reducing Cost, Risk and Speed. To know more please visit: http://www.everestdx.com
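As a rough sketch of the archival and retention automation mentioned in the responsibilities above, here is a hypothetical pyodbc job against Azure SQL that copies old rows to an archive table and then purges them; the connection string, table names, schemas, and retention window are assumptions.

```python
import pyodbc

# Illustrative retention job for Azure SQL: archive events older than 90 days, then purge them.
CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"          # hypothetical server
    "Database=telemetry;Uid=etl_user;Pwd=********;Encrypt=yes;"
)
RETENTION_DAYS = 90

def archive_old_events() -> None:
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Assumes dbo.events_archive has the same columns as dbo.events.
        cur.execute(
            """
            INSERT INTO dbo.events_archive
            SELECT * FROM dbo.events
            WHERE event_time < DATEADD(day, ?, SYSUTCDATETIME())
            """,
            -RETENTION_DAYS,
        )
        cur.execute(
            "DELETE FROM dbo.events WHERE event_time < DATEADD(day, ?, SYSUTCDATETIME())",
            -RETENTION_DAYS,
        )
        conn.commit()
        print(f"Archived and purged events older than {RETENTION_DAYS} days")

if __name__ == "__main__":
    archive_old_events()
```

In this role such a script would typically be wrapped in an Azure Automation runbook or Jenkins job and tracked through the same dashboards used for cost and performance monitoring.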

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

About the Role: Grade Level (for internal use): 10 The Team : Team is responsible for the development of tools to collect data from various sources. It is the backbone of all the data being presented to clients. The team is responsible for modernizing and migrating the internal platform utilizing latest technologies. The Impact : As a Software Developer III, you will be part of development team that manages multi-terabyte data using latest web, cloud and big data technologies. You will be part of a heavy data intensive environment. What’s in it for you : It’s a fast-paced agile environment that deals with huge volumes of data, so you’ll have an opportunity to sharpen your data skills and work on emerging technology stack. Responsibilities : Design, and implement software components for data processing systems. Perform analysis and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end-users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits. What We’re Looking For : Basic Qualifications : B.S. in Computer Science or equivalent 4+ years of relevant experience Expert in OOPs, .Net and C# concepts Expert in server side programming using ASP.Net Or Python Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP) Experience with Big Data platforms such as Apache Spark Proficient with software development lifecycle (SDLC) methodologies like Scaled Agile, Test-driven development Good experience with developing solutions involving relational database technologies on SQL . Server platform, stored procedure programming experience using Transact SQL. Passionate, smart, and articulate developer Able to work well individually and with a team Strong problem-solving skills Good work ethic, self-starter, and results-oriented Preferred Qualifications : Experience working in cloud computing environments such as AWS Experience with large-scale messaging systems What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. 
Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 318313 Posted On: 2025-07-28 Location: Hyderabad, Telangana, India
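For context on the Apache Spark requirement in the listing above, here is a minimal PySpark batch job of the kind such data-processing work might involve. It is purely illustrative: the paths, column names, and aggregation logic are invented, and the role's primary stack remains .NET/C#.

```python
# Minimal PySpark batch job: ingest raw records, aggregate, and write results.
# All paths, column names, and the aggregation logic are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run(input_path: str, output_path: str) -> None:
    spark = (
        SparkSession.builder
        .appName("daily-ingest-aggregation")
        .getOrCreate()
    )

    # Read partitioned source data (e.g., Parquet landed by an upstream collector).
    raw = spark.read.parquet(input_path)

    # Basic cleansing: drop malformed rows and derive a date column from a timestamp.
    cleaned = (
        raw.dropna(subset=["entity_id", "event_ts"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Aggregate per entity per day; this stands in for whatever business rollup applies.
    daily = cleaned.groupBy("entity_id", "event_date").agg(
        F.count("*").alias("event_count"),
        F.sum("value").alias("total_value"),
    )

    # Write back partitioned by date so downstream consumers can prune efficiently.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    run("s3://example-bucket/raw/events/", "s3://example-bucket/curated/daily/")
```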

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

Remote

Req ID: 335295 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake Engineer - Digital Solution Consultant Sr. Analyst to join our team in Hyderabad, Telangana (IN-TG), India (IN). Experience with other cloud data warehousing solutions. Knowledge of big data technologies (e.g., Apache Spark, Hadoop). Experience with CI/CD pipelines and DevOps practices. Familiarity with data visualization tools. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees. NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form, https://us.nttdata.com/en/contact-us. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
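As a rough illustration of the Snowflake engineering work this role references, the sketch below stages a local CSV and loads it with COPY INTO using the snowflake-connector-python client. The account, credentials, table, and file names are placeholders, not details taken from the posting.

```python
# Minimal sketch: stage a local CSV into Snowflake and load it with COPY INTO.
# Connection parameters, table, and file names are placeholders, not real values.
import snowflake.connector

def load_csv(local_path: str) -> None:
    conn = snowflake.connector.connect(
        account="xy12345.us-east-1",   # placeholder account locator
        user="ETL_USER",
        password="***",                # in practice, use a secrets manager or key-pair auth
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Upload the file to the target table's internal stage.
        cur.execute(f"PUT file://{local_path} @%ORDERS_RAW OVERWRITE = TRUE")
        # Load staged files into the table, skipping the header row.
        cur.execute(
            "COPY INTO ORDERS_RAW "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
            "ON_ERROR = 'ABORT_STATEMENT'"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    load_csv("/tmp/orders_2025_07_28.csv")
```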

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

Remote

Software Engineer II Hyderabad, Telangana, India Date posted Jul 28, 2025 Job number 1851616 Work site Up to 50% work from home Travel 0-25 % Role type Individual Contributor Profession Software Engineering Discipline Software Engineering Employment type Full-Time Overview The Purview team is dedicated to protecting and governing the enterprise digital estate on a global scale. Our mission involves developing cloud solutions that offer premium features such as security, compliance, data governance, data loss prevention and insider risk management. These solutions are fully integrated across Office 365 services and clients, as well as Windows. We create global-scale services to transport, store, secure, and manage some of the most sensitive data on the planet, leveraging Azure, Exchange, and other cloud platforms, along with Office applications like Outlook. The IDC arm of our team is expanding significantly and seeks talented, highly motivated engineers. This is an excellent opportunity for those looking to build expertise in cloud distributed systems, security, and compliance. Our team will develop cloud solutions that meet the demands of a vast user base, utilizing state-of-the-art technologies to deliver comprehensive protection. Office 365, the industry leader in hosted productivity suites, is the fastest-growing business at Microsoft, with over 100 million seats hosted in multiple data centers worldwide. The Purview Engineering team provides leadership, direction, and accountability for application architecture, cloud design, infrastructure development, and end-to-end implementation. You will independently determine and develop architectural approaches and infrastructure solutions, conduct business reviews, and operate our production services. Strong collaboration skills are essential to work closely with other engineering teams, ensuring our services and systems are highly stable, performant, and meet the expectations of both internal and external customers and users. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications Qualifications - Required: Solid understanding of Object-Oriented Programming (OOP) and common Design Patterns. Minimum of 4+ years of software development experience, with proficiency in C#, Java, or scala. Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud; experience with Azure Services is a plus. Familiarity with DevOps practices, CI/CD pipelines, and agile methodologies. Strong skills in distributed systems and data processing. Excellent communication and collaboration abilities, with the capacity to handle ambiguity and prioritize effectively. A BS or MS degree in Computer Science or Engineering, or equivalent work experience. Qualifications - Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements are required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft background and Microsoft Cloud background check upon hire/transfer and every two years thereafter. 
Responsibilities: Build cloud-scale services that process and analyze massive volumes of organizational signals in real time. Harness the power of Apache Spark for high-performance data processing and scalable pipelines. Apply machine learning to uncover subtle patterns and anomalies that signal insider threats. Craft intelligent user experiences using React and AI-driven insights to help security analysts act with confidence. Work with a modern tech stack and contribute to a product that's mission-critical for some of the world's largest organizations. Collaborate across disciplines—from data science to UX to cloud infrastructure—in a fast-paced, high-impact environment. Design and deliver end-to-end features including system architecture, coding, deployment, scalability, performance, and quality. Develop large-scale distributed software services and solutions that are modular, secure, reliable, diagnosable, and reusable. Conduct investigations and drive investments in complex technical areas to improve systems and services. Ensure engineering excellence by writing effective code, unit tests, debugging, code reviews, and building CI/CD pipelines. Troubleshoot and optimize Live Site operations, focusing on automation, reliability, and monitoring. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
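The responsibilities above mention Spark pipelines that surface anomalous signals for analysts. The sketch below is a deliberately simplified stand-in: it assumes hypothetical column names and uses a plain z-score rule in place of the machine learning models such a team would actually deploy.

```python
# Simplified sketch: score per-user daily activity against that user's own history
# and flag statistical outliers. Column names and the threshold are illustrative only.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("activity-anomaly-sketch").getOrCreate()

events = spark.read.parquet("abfss://signals@example.dfs.core.windows.net/user-events/")

daily = events.groupBy("user_id", "event_date").agg(
    F.count("*").alias("actions"),
    F.sum("bytes_downloaded").alias("bytes_downloaded"),
)

# Per-user rolling baseline over the trailing 30 days, excluding the current day.
w = (
    Window.partitionBy("user_id")
    .orderBy(F.col("event_date").cast("timestamp").cast("long"))
    .rangeBetween(-30 * 86400, -86400)
)

scored = (
    daily.withColumn("mean_bytes", F.avg("bytes_downloaded").over(w))
         .withColumn("std_bytes", F.stddev("bytes_downloaded").over(w))
         .withColumn(
             "z_score",
             (F.col("bytes_downloaded") - F.col("mean_bytes")) / F.col("std_bytes"),
         )
)

# Anything more than 3 standard deviations above the user's baseline is surfaced
# for analyst review; a production system would feed richer features into an ML model.
alerts = scored.filter(F.col("z_score") > 3)
alerts.write.mode("append").parquet("abfss://signals@example.dfs.core.windows.net/alerts/")
```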

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

Karnataka

On-site

Bengaluru, Karnataka, India Sub Business Unit Engineer Job posted on Jul 28, 2025 Employee Type Permanent Experience range (Years) 2 years - 4 years Core Qualifications Experience: 2-4 years of professional experience in the full software development lifecycle. Python Proficiency: Solid, hands-on coding experience in Python. Computer Science Fundamentals: A deep and practical understanding of Data Structures, Algorithms, and Software Design Patterns. Version Control: Proficiency with Git or other distributed version control systems. Analytical Mindset: Excellent analytical, debugging, and problem-solving abilities. Java Knowledge: Basic understanding of Java concepts and syntax. Testing : Familiarity with testing tools and a commitment to Test-Driven Development (TDD) principles. Education: Bachelor's degree in Computer Science, Computer Engineering, or a related technical field. Good to have Generative AI : Experience with frameworks like LangChain, Google ADK/Autogen, or direct experience with APIs from OpenAI, Google (Gemini), or other LLM providers. Web Stacks : Strong knowledge of Python web frameworks like Flask or FastAPI, Celery etc. Data Engineering : Hands-on experience with NumPy, Pandas/Polars, data pipeline tools (e.g., Apache Spark, Kafka), and visualization. Databases : Proficiency with both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB, Elasticsearch, Redis) databases. DevOps & Cloud : Experience with AWS (including EC2, Lambda, EKS/ECS), Docker, and CI/CD best practices and pipelines (e.g., GitLab CI). Operating Systems : Good working knowledge of programming in a UNIX/Linux environment. FinTech Domain : Prior experience or interest in the financial technology sector is a plus. Reporting to Technical Lead
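To make the Python web-stack expectations above concrete, here is a minimal FastAPI sketch. The service name, endpoints, and the Position model are hypothetical and simply illustrate the request/response style such a role involves; a real service would back this with one of the SQL or NoSQL stores named in the listing.

```python
# Minimal FastAPI service sketch; paths, model fields, and the in-memory store are made up.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="positions-service")

class Position(BaseModel):
    symbol: str
    quantity: float
    price: float

# Stand-in for a real database layer.
_positions: dict[str, Position] = {}

@app.put("/positions/{symbol}")
def upsert_position(symbol: str, position: Position) -> Position:
    # Create or replace the position for a given symbol.
    _positions[symbol] = position
    return position

@app.get("/positions/{symbol}")
def get_position(symbol: str) -> Position:
    if symbol not in _positions:
        raise HTTPException(status_code=404, detail="unknown symbol")
    return _positions[symbol]

# Run locally with: uvicorn app:app --reload
```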

Posted 1 week ago

Apply

0.0 - 15.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Description: Lead the design, development, and implementation of scalable data pipelines and ELT processes using Databricks, DLT, dbt, Airflow, and other tools. Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and maintain existing data pipelines to ensure data quality, reliability, and performance. Develop and enforce data engineering best practices, including coding standards, testing, and documentation. Mentor junior data engineers, providing technical leadership and fostering a culture of continuous learning and improvement. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to business operations. Stay up to date with the latest industry trends and technologies, and proactively recommend improvements to our data engineering practices.
Qualifications: Degree in Management Information Systems (MIS), Data Science, or a related field. 15 years of experience in data engineering and/or architecture, with a focus on big data technologies. Extensive production experience with Databricks, Apache Spark, and other related technologies. Familiarity with orchestration and ELT tools like Airflow and dbt. Expert SQL knowledge. Proficiency in programming languages such as Python, Scala, or Java. Strong understanding of data warehousing concepts. Experience with cloud platforms such as Azure, AWS, or Google Cloud. Excellent problem-solving skills and the ability to work in a fast-paced, collaborative environment. Strong communication and leadership skills, with the ability to effectively mentor and guide junior engineers. Experience with machine learning and data science workflows. Knowledge of data governance and security best practices. Certification in Databricks, Azure, Google Cloud, or related technologies.
Job: Engineering Primary Location: India-Karnataka-Bengaluru Schedule: Full-time Travel: No Req ID: 252684 Job Hire Type: Experienced Not Applicable #BMI N/A
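As an illustration of the orchestration tooling named above (Airflow alongside dbt), the following DAG sketches a daily ELT flow: ingest, transform with dbt, then a lightweight data-quality gate. The DAG id, commands, and selectors are assumptions made for the example, not details from the posting.

```python
# Illustrative Airflow DAG for a daily ELT flow; task commands and names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def check_row_counts() -> None:
    # Placeholder quality gate; a real check would query the warehouse
    # and raise if counts fall outside expected bounds.
    print("row-count checks passed")

with DAG(
    dag_id="daily_elt_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest_raw.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --select staging+ marts+",
    )
    quality_gate = PythonOperator(
        task_id="quality_checks",
        python_callable=check_row_counts,
    )

    # Run ingestion, then dbt transformations, then the quality gate.
    ingest >> transform >> quality_gate
```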

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location Bengaluru, Karnataka, India Job ID R-232528 Date posted 28/07/2025 Job Title: Analyst – Data Engineer Introduction to role: Are you ready to make a difference in the world of data science and advanced analytics? As a Data Engineer within the Commercial Strategic Data Management team, you'll play a pivotal role in transforming data science solutions for the Rare Disease Unit. Your mission will be to craft, develop, and deploy data science solutions that have a real impact on patients' lives. By leveraging cutting-edge tools and technology, you'll enhance delivery performance and data engineering capabilities, creating a seamless platform for the Data Science team and driving business growth. Collaborate closely with the Data Science and Advanced Analytics team, US Commercial leadership, Sales Field Team, and Field Operations to build data science capabilities that meet commercial needs. Are you ready to take on this exciting challenge? Accountabilities: Collaborate with the Commercial Multi-functional team to find opportunities for using internal and external data to enhance business solutions. Work closely with business and advanced data science teams on cross-functional projects, delivering complex data science solutions that contribute to the Commercial Organization. Manage platforms and processes for complex projects using a wide range of data engineering techniques in advanced analytics. Prioritize business and information needs with management; translate business logic into technical requirements, such as creating queries, stored procedures, and scripts. Interpret data, process it, analyze results, present findings, and provide ongoing reports. Develop and implement databases, data collection systems, data analytics, and strategies that optimize data efficiency and quality. Acquire data from primary or secondary sources and maintain databases/data systems. Identify and define new process improvement opportunities. Manage and support data solutions in BAU scenarios, including data profiling, designing data flow, creating business alerts for fields, and query optimization for ML models. 
Essential Skills/Experience: BS/MS in a quantitative field (Computer Science, Data Science, Engineering, Information Systems, Economics) 5+ years of work experience with DB skills like Python, SQL, Snowflake, Amazon Redshift, MongoDB, Apache Spark, Apache Airflow, AWS cloud and Amazon S3 experience, Oracle, Teradata Good experience in Apache Spark or Talend Administration Center or AWS Lambda, MongoDB, Informatica, SQL Server Integration Services Experience in building ETL pipeline and data integration Build efficient Data Management (Extract, consolidate and store large datasets with improved data quality and consistency) Streamlined data transformation: Convert raw data into usable formats at scale, automate tasks, and apply business rules Good written and verbal skills to communicate complex methods and results to diverse audiences; willing to work in a cross-cultural environment Analytical mind with problem-solving inclination; proficiency in data manipulation, cleansing, and interpretation Experience in support and maintenance projects, including ticket handling and process improvement Setting up Workflow Orchestration (Schedule and manage data pipelines for smooth flow and automation) Importance of Scalability and Performance (handling large data volumes with optimized processing capabilities) Experience with Git Desirable Skills/Experience: Knowledge of distributed computing and Big Data Technologies like Hive, Spark, Scala, HDFS; use these technologies along with statistical tools like Python/R Experience working with HTTP requests/responses and API REST services Familiarity with data visualization tools like Tableau, Qlik, Power BI, Excel charts/reports Working knowledge of Salesforce/Veeva CRM, Data governance, and Data mining algorithms Hands-on experience with EHR, administrative claims, and laboratory data (e.g., Prognos, IQVIA, Komodo, Symphony claims data) Good experience in consulting, healthcare, or biopharmaceuticals When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca's Alexion division, you'll find an environment where your work truly matters. Embrace the opportunity to grow and innovate within a rapidly expanding portfolio. Experience the entrepreneurial spirit of a leading biotech combined with the resources of a global pharma. You'll be part of an energizing culture where connections are built to explore new ideas. As a member of our commercial team, you'll meet the needs of under-served patients worldwide. With tailored development programs designed for skill enhancement and fostering empathy for patients' journeys, you'll align your growth with our mission. Supported by exceptional leaders and peers across marketing and compliance, you'll drive change with integrity in a culture celebrating diversity and innovation. Ready to make an impact? Apply now to join our team! Date Posted 29-Jul-2025 Closing Date 04-Aug-2025 Alexion is proud to be an Equal Employment Opportunity and Affirmative Action employer. We are committed to fostering a culture of belonging where every single person can belong because of their uniqueness. 
The Company will not make decisions about employment, training, compensation, promotion, and other terms and conditions of employment based on race, color, religion, creed or lack thereof, sex, sexual orientation, age, ancestry, national origin, ethnicity, citizenship status, marital status, pregnancy, (including childbirth, breastfeeding, or related medical conditions), parental status (including adoption or surrogacy), military status, protected veteran status, disability, medical condition, gender identity or expression, genetic information, mental illness or other characteristics protected by law. Alexion provides reasonable accommodations to meet the needs of candidates and employees. To begin an interactive dialogue with Alexion regarding an accommodation, please contact accommodations@Alexion.com. Alexion participates in E-Verify.
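To ground the ETL pipeline responsibilities listed above, here is a small pandas-based extract-transform-load sketch against S3. The bucket paths, columns, and business rule are placeholders; reading s3:// paths this way assumes s3fs is installed alongside pandas.

```python
# Small ETL sketch: pull a raw extract from S3, apply cleansing rules, and land a
# curated Parquet file back in S3. All names below are placeholders.
import pandas as pd

RAW_PATH = "s3://example-commercial-data/raw/claims_extract.csv"
CURATED_PATH = "s3://example-commercial-data/curated/claims_clean.parquet"

def run() -> None:
    # Extract: read the raw landing file with explicit string dtypes for identifiers.
    df = pd.read_csv(RAW_PATH, dtype={"patient_id": "string", "npi": "string"})

    # Transform: standardize dates, drop obviously bad rows, derive a simple flag.
    df["service_date"] = pd.to_datetime(df["service_date"], errors="coerce")
    df = df.dropna(subset=["patient_id", "service_date"])
    df["is_recent"] = df["service_date"] >= pd.Timestamp("2025-01-01")

    # Load: write a columnar file for downstream analytics or warehouse ingestion.
    df.to_parquet(CURATED_PATH, index=False)

if __name__ == "__main__":
    run()
```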

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience: 3-5 years Looking for a skilled backend developer with strong experience in Java, Spring Boot, and Apache Spark. Responsible for building scalable microservices and processing large datasets in real-time or batch environments. Must have solid understanding of REST APIs, distributed systems, and data pipelines. Experience with cloud platforms (AWS/GCP) is a plus.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : Apache Spark Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are developed according to the specified requirements and are aligned with the business goals. Your typical day will involve collaborating with the team to understand the application requirements, designing and developing the applications using PySpark, and configuring the applications to meet the business process needs. You will also be responsible for testing and debugging the applications to ensure their functionality and performance. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Design and build applications using PySpark. - Configure applications to meet business process requirements. - Collaborate with the team to understand application requirements. - Test and debug applications to ensure functionality and performance. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Good To Have Skills: Experience with Apache Spark. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based at our Chennai office. - A 15 years full time education is required.
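Since the role above centers on PySpark plus basic machine learning, the sketch below cleans a dataset, assembles features, and fits a logistic regression with Spark ML. The dataset path, columns, and label are invented for illustration.

```python
# Hedged sketch of a PySpark ML workflow: data munging, feature assembly, and a
# logistic regression fit. Input path, columns, and label are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler, StandardScaler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("pyspark-ml-sketch").getOrCreate()

# Data munging: load, fill gaps, and cast the label to a numeric type.
df = (
    spark.read.parquet("/data/customers/")
    .fillna({"tenure_months": 0, "monthly_spend": 0.0})
    .withColumn("churned", F.col("churned").cast("double"))
)

assembler = VectorAssembler(
    inputCols=["tenure_months", "monthly_spend", "support_tickets"],
    outputCol="raw_features",
)
scaler = StandardScaler(inputCol="raw_features", outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="churned")

train, test = df.randomSplit([0.8, 0.2], seed=42)
model = Pipeline(stages=[assembler, scaler, lr]).fit(train)

# Quick sanity check on held-out data.
predictions = model.transform(test)
accuracy = predictions.filter(F.col("prediction") == F.col("churned")).count() / test.count()
print(f"holdout accuracy: {accuracy:.3f}")
```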

Posted 1 week ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company: Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!
Role Summary: Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 4B+ business serving the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Experience Manager, within Adobe DX, is the world's largest CMS platform: a solution that helps enterprises create, manage, and deliver digital experiences across channels like websites, mobile apps, and digital signage. According to a Forrester report, Experience Manager is the most robust CMS on the market. More than 128,000 websites rely on the agile setup of Experience Manager to manage their content. We are looking for strong and passionate engineers and managers to join our team as we scale the business by building next-gen products and adding customer value to our existing offerings. If you're passionate about innovative technology, then we would be excited to talk to you!
What You'll Do: Mentor and guide a high-performing engineering team to deliver outstanding results. Lead the technical design, vision, and implementation strategy for next-gen multi-cloud services. Partner with global leaders to help craft product architecture, roadmap, and release plans. Drive strategic decisions ensuring successful project delivery and high code quality. Apply standard methodologies and coding patterns to develop maintainable and modular solutions. Optimize team efficiency through innovative engineering processes and teamwork models. Attract, hire, and retain top talent while encouraging a positive, collaborative culture. Lead discussions on emerging industry technologies and influence product direction.
What you need to succeed: 12+ years of experience in software development with a proven leadership track record, including at least 3 years as a manager leading a team of high-performing full-stack engineers. Proficiency in Java/JSP for backend development and experience with frontend technologies like React, Angular, or jQuery. Experience with cloud platforms such as AWS or Azure. Proficiency in version control, CI/CD pipelines, and DevOps practices. Familiarity with Docker, Kubernetes, and Infrastructure as Code tools. Experience with WebSockets or event-driven architectures. Deep understanding of modern software architecture, including microservices and API-first development. Proven usage of AI/GenAI engineering productivity tools like GitHub Copilot and Cursor. Practical experience with Python would be helpful. Exposure to open-source contribution models such as Apache or Linux Foundation projects, or other third-party frameworks, would be an added advantage.
Strong problem-solving, analytical, and decision-making skills Excellent communication, collaboration, and management skills Passion for high-quality software and improving engineering processes BS/MS or equivalent experience in Computer Science or a related field Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 1 week ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies