
236 NiFi Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

3 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

As a Senior Associate L2 in Data Engineering, you'll be responsible for translating client requirements into technical designs and implementing components for data engineering solutions. This role requires a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions. You'll independently drive design discussions to ensure the robust health of the overall solution.

Your Impact: What You'll Achieve
• Contribute to data ingestion, integration, and transformation.
• Work with data storage and computation frameworks, focusing on performance optimizations.
• Support analytics and visualizations.
• Develop solutions related to infrastructure and cloud computing.
• Engage with data management platforms.
• Build functionality for data ingestion from multiple heterogeneous sources in both batch and real time.
• Build functionality for data analytics, search, and aggregation.

Qualifications: Your Skills & Experience
• Minimum 3 years of experience in Big Data technologies.
• Hands-on experience with the Hadoop stack, including HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components necessary for building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
• Strong experience in at least one programming language: Java, Scala, or Python, with Java preferred.
• Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
• Well-versed in, and with working knowledge of, data platform-related services on Azure.
• Bachelor's degree and 6 to 8 years of work experience, or any combination of education, training, and/or experience that demonstrates the ability to perform the duties of the position.

Set Yourself Apart With
• Good hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
• Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
• Knowledge of distributed messaging frameworks like ActiveMQ/RabbitMQ/Solace, search and indexing, and microservices architectures.
• Experience in performance tuning and optimization of data pipelines.
• Cloud data specialty and other related Big Data technology certifications.
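For illustration only (not part of the posting): a minimal PySpark Structured Streaming sketch of the kind of batch/real-time ingestion this role describes, reading from an assumed Kafka topic and landing Parquet on HDFS. The broker, topic, schema, and paths are invented for the example, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Sketch: Kafka topic -> Spark Structured Streaming -> Parquet on HDFS.
# Requires the spark-sql-kafka package; all names below are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = (StructType()
          .add("event_id", StringType())
          .add("amount", DoubleType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
       .option("subscribe", "events")                     # assumed topic
       .load())

# Kafka delivers bytes; decode the value column and apply the schema.
parsed = (raw
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/events")           # assumed sink path
         .option("checkpointLocation", "hdfs:///chk/events")
         .start())
query.awaitTermination()
```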

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About The Role
Grade Level (for internal use): 09

The Team
We are looking for a highly motivated engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in software engineering, irrespective of hierarchy, promoting innovation. You should feel empowered to iterate over ideas and experiment without fear of failure.

Impact
You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What You Can Expect
An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities
We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies spanning web applications, data pipelines, big data, machine learning, and multi-cloud. Development is already underway, so the candidate is expected to get up to speed very quickly and start contributing.
• Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests.
• Past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible, and Prometheus or related cloud technologies.
• Good understanding of single, hybrid, and multi-cloud architectures, preferably with hands-on experience.
• Active participation in all scrum ceremonies; follow Agile best practices effectively.
• Play a key role in the development team to build high-quality, high-performance, scalable code.
• Produce technical design documents and conduct technical walkthroughs.
• Document and demonstrate solutions using technical design docs, diagrams, and stubbed code.
• Collaborate effectively with technical and non-technical stakeholders.
• Respond to and resolve production issues.

What We Are Looking For
• Minimum of 5-8 years of significant experience in application development.
• Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
• Experience working with high-volume data and computationally intensive systems.
• Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must.
• Proficiency in the development environment, including IDE, web and application servers, Git, continuous integration, unit-testing tools, and defect management tools.
• Domain knowledge in the financial industry and capital markets is a plus.
• Excellent communication skills are essential, with strong verbal and writing proficiencies.
• Mentor teams, innovate and experiment, give shape to business ideas, and present to key stakeholders.

Required Technical Skills
• Build data pipelines.
• Utilize platforms like Snowflake, Talend, Databricks, etc.
• Utilize cloud managed services like AWS Step Functions, AWS Lambda, and AWS DynamoDB.
• Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow.
• Develop federated data services to provide scalable and performant data APIs (REST, GraphQL, OData).
• Write infrastructure as code to develop sandbox environments.
• Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
• Feed data at scale to clients that are geographically distributed.
• Experience building sophisticated and highly automated infrastructure.
• Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc.
• Demonstrated ability to adapt to new technologies and learn quickly.

Desirable Technical Skills
Java, Spring Boot, React, HTML/CSS, API development, microservices patterns, cloud technologies and managed services (preferably AWS), Big Data and analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
• Health & Wellness: Health care coverage designed for the mind and body.
• Flexible Downtime: Generous time off helps keep you energized for your time on.
• Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
• Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
• Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
• Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
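A hedged illustration (not S&P code) of the serverless pattern named in the technical skills above: an AWS Lambda handler that validates an incoming record and writes it to DynamoDB via boto3, the kind of step a Step Functions workflow would orchestrate. The table name and record fields are assumptions.

```python
# Hypothetical Lambda handler for the Step Functions / Lambda / DynamoDB
# pattern mentioned above. Table name and record fields are invented.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("marketplace-events")  # assumed table name

def handler(event, context):
    record = event.get("record", {})
    if "event_id" not in record:
        # Raising lets a Step Functions state machine route to a failure state.
        raise ValueError("record is missing required key 'event_id'")
    # put_item overwrites any existing item with the same key, so retries
    # of the same event are safe.
    table.put_item(Item=record)
    return {"statusCode": 200, "body": json.dumps({"stored": record["event_id"]})}
```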

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

Title And Summary
Lead Data Engineer

Overview
We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard - Services
The Services org is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million-dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Advanced Analytics Program
Within the Services Technology Team, the Advanced Analytics program is a relatively new program comprised of a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API and web application-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes.

We are looking for an innovative lead data engineer who will lead the technical design and development of an Analytic Foundation. The Analytic Foundation is a suite of individually commercialized analytical capabilities that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers.

Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects. Here are a few examples of products in our space:
• Portfolio Optimizer (PO) is a solution that leverages Mastercard's data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
• Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have a high likelihood to make purchases within a category, allowing for more effective campaign planning and activation.
• Credit Risk products are a new suite of APIs and tooling that provide lenders real-time access to KPIs and insights, serving thousands of clients making smarter risk decisions using Mastercard data.

Help found a new, fast-growing engineering team!

Position Responsibilities
As a Lead Data Engineer within the Advanced Analytics team, you will:
• Lead collaboration with data scientists to understand the existing modeling pipeline and identify optimization opportunities.
• Oversee the integration and management of data from various sources and storage systems, establishing processes and pipelines to produce cohesive datasets for analysis and modeling.
• Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features.
• Design and develop data pipelines to automate repetitive tasks within data science and data engineering.
• Bring demonstrated experience leading cross-functional teams or working across different teams to solve complex problems.
• Identify patterns and innovative solutions in existing spaces, consistently seeking opportunities to simplify, automate tasks, and build reusable components for multiple use cases and teams.
• Create data products that are well-modeled, thoroughly documented, and easy to understand and maintain.
• Comfortably lead projects in environments with undefined or ambiguous requirements.
• Mentor junior data engineers.

Ideal Candidate Qualifications
• High proficiency in using Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Oozie, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms.
• Extensive experience with the Spark processing engine.
• Proficiency in at least one modern programming language such as Python, Java, or Scala.
• Strong computer science fundamentals in object-oriented design, data structures, algorithm design, problem solving, and complexity analysis.
• Ability to move easily between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it.
• Working knowledge of software development engineering paradigms along with data engineering.
• Relational database as well as NoSQL experience.
• Experience in cloud technologies like Databricks/AWS/Azure.
• Basic shell scripting and knowledge of Linux/Unix systems.
• Experience in designing and developing software at scale.
• Strong written and verbal English communication skills.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
• Abide by Mastercard's security policies and practices;
• Ensure the confidentiality and integrity of the information being accessed;
• Report any suspected information security violation or breach; and
• Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-248104
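For illustration only (not Mastercard code): a small Apache Airflow DAG of the kind this role would own, automating a repetitive modeling-data refresh. The DAG id, schedule, and task bodies are assumptions; the `schedule` argument assumes Airflow 2.4 or later.

```python
# Hypothetical daily refresh DAG: extract source data, then build features.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transactions(**context):
    print("pulling source data...")       # placeholder for real ingestion logic

def build_features(**context):
    print("building model features...")   # placeholder for a Spark job trigger

with DAG(
    dag_id="analytic_foundation_refresh",  # assumed name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                     # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_transactions)
    features = PythonOperator(task_id="features", python_callable=build_features)
    extract >> features  # features runs only after extract succeeds
```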

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

Title And Summary
Data Engineer II

Overview
Mastercard Data Warehouse builds and supports various analytics and reporting platforms, including Domo, Power BI, Tableau, SSRS, and WebFOCUS. The ecosystem also includes the Microsoft BI stack (SSAS and SSIS) and the Alteryx ETL tool. We are looking for a data engineer who can build efficient data pipelines to feed downstream reporting and analytics platforms.

Ideal candidates will answer yes to all of the following questions:
• Have you worked in an agile development team?
• Are you proficient in Big Data and ETL/ELT tools (for example, NiFi)?
• Are you passionate about building technology to deliver quality solutions providing a superior customer experience?
• Do you thrive in a fast-paced environment, and can you learn new skills and applications quickly?

Role Description
The Data Engineer II will participate in the data management aspects of client engagements to deliver data products, as well as contribute to and foster a high-performance collaborative workplace. A Data Engineer II will:
• Independently execute projects through design, implementation, automation, and maintenance of large-scale enterprise ETL processes for a global client base.
• Develop repeatable and scalable code that processes client data in an automated and efficient manner to ensure data availability in the platform is as close to real time as possible.
• Act as an expert data resource within the team.
• Manage the process of data delivery on teams by overseeing other Data Engineers and Analysts to deliver on-time, accurate, high-value, robust data solutions across multiple clients, solutions, and industry sectors.
• Build trust-based working relationships with peers and clients across local and global teams.
• Leverage industry best practices, including proper use of source control, code reviews, data validation, and testing.
• Enhance big data pipelines using Python, R, SSIS, and PowerShell to address complex technical challenges and communicate seamlessly with several cloud storage technologies.
• Leverage newer SQL Server features such as columnstore indexes, In-Memory OLTP, incremental statistics, trace flags, SQL CLR functions, window aggregate functions, and parallel computing algorithms to reduce the processing time of multi-billion-row data sets.
• Contribute to the automation capabilities of the team; implement techniques to optimize and routinize repeatable tasks.
• Comply with and uphold all Mastercard internal policies and external regulations.

All About You
• Bachelor's degree in Computer Science, Software Engineering, or a related field.
• Extensive hands-on experience in data engineering, including implementing multiple end-to-end data warehouse projects in Big Data environments.
• Proficiency in application development frameworks (Python, Java/Scala) and data processing/storage frameworks (Hadoop, Spark, Kafka).
• Experience developing data orchestration workflows using tools such as Apache NiFi, Apache Airflow, or similar platforms to automate and streamline data pipelines.
• Experience with performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts.
• Experience working in Agile teams.
• Experience in development of data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale using Java, Scala, or Python, covering all phases such as data ingestion, feature engineering, modeling, tuning, evaluating, monitoring, and presenting analytics.
• Experience developing integrated cloud applications with services like Azure, Databricks, AWS, or GCP.
• Excellent analytical and problem-solving skills, with the ability to analyze complex data issues and develop practical solutions.
• Strong communication and interpersonal skills, with the ability to collaborate effectively with, and facilitate activities across, geographically distributed cross-functional teams and stakeholders.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
• Abide by Mastercard's security policies and practices;
• Ensure the confidentiality and integrity of the information being accessed;
• Report any suspected information security violation or breach; and
• Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-248120
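A hedged illustration (not from the posting) of the window-aggregate technique mentioned above: computing a per-client running total in SQL Server from Python via pyodbc, which avoids a self-join over a large fact table. The connection details, table, and columns are hypothetical.

```python
# Hypothetical example of a SQL Server window aggregate queried via pyodbc.
# Server, database, table, and column names are invented.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dw-host;DATABASE=warehouse;Trusted_Connection=yes;"  # assumed DSN
)

sql = """
SELECT client_id,
       txn_date,
       SUM(amount) OVER (PARTITION BY client_id
                         ORDER BY txn_date
                         ROWS UNBOUNDED PRECEDING) AS running_total
FROM dbo.transactions;
"""

cur = conn.cursor()
for client_id, txn_date, running_total in cur.execute(sql):
    print(client_id, txn_date, running_total)
cur.close()
conn.close()
```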

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Role
The Senior Software Engineer will be responsible for developing solutions with a high level of innovation, high quality, and faster time to market. This position interacts with product managers, engineering leaders, architects, software developers, and business operations on the definition and delivery of highly scalable and secure solutions. The role includes:
• Hands-on developer who writes high-quality, secure code that is modular, functional, and testable.
• Create or introduce, test, and deploy new technology to optimize the service.
• Contribute to all parts of the software's development, including design, development, documentation, and testing.
• Have strong ownership.
• Communicate, collaborate, and work effectively in a global environment.
• Responsible for ensuring application stability in production by creating solutions that provide operational health.
• Mentoring and leading new developers while driving modern engineering practices.

All About You
• Strong analytical and excellent problem-solving skills, and experience working in an Agile environment.
• Experience with XP, TDD, and BDD in the software development process.
• Proficiency in Java, Scala, and SQL (Oracle, Postgres, H2, Hive, and HBase), and in building pipelines.
• Expertise and deep understanding of the Hadoop ecosystem, including HDFS, YARN, and MapReduce; tools like Hive and Pig/Flume; data processing frameworks like Spark; cloud platforms; orchestration tools such as Apache NiFi/Airflow; and Apache Kafka.
• Expertise in web applications (Spring Boot, Angular, Java, PCF), web services (REST/OAuth), Big Data technologies (Hadoop, Spark, Hive, HBase), and tools (Sonar, Splunk, Dynatrace).
• Expertise in SQL, Oracle, and Postgres.
• Experience in microservices and event-driven architecture.
• Soft skills: strong verbal and written communication to demo features to product owners; strong leadership qualities to mentor and support junior team members; proactive, with the initiative to take development work from inception to implementation.
• Familiarity with secure coding standards (e.g., OWASP, CWE, SEI CERT) and vulnerability management.

Posted 2 weeks ago

Apply

6.0+ years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted, and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop, and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position
In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.

Position Overview
We are seeking an experienced ETL Architect to design, develop, and optimize data extraction, transformation, and loading (ETL) solutions, and to work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This role requires deep expertise in Python, AWS Cloud, and ETL tools to build and maintain scalable data pipelines and architectures. The ETL Architect will work closely with cross-functional teams to ensure efficient data integration, storage, and accessibility for business intelligence and analytics.

Key Responsibilities
• ETL Design & Development: Architect and implement high-performance ETL pipelines using AWS cloud services, Snowflake, and ETL tools such as Talend, dbt, Informatica, ADF, etc.
• Data Architecture: Design and implement scalable, efficient, and cloud-native data architectures.
• Data Integration & Flow: Ensure seamless data integration across multiple source systems, leveraging AWS Glue, Snowflake, and other ETL tools.
• Performance Optimization: Monitor and tune ETL processes for performance, scalability, and cost-effectiveness.
• Governance & Security: Establish and enforce data quality, governance, and security standards for ETL processes.
• Collaboration: Work with data engineers, analysts, and business stakeholders to define data requirements and ensure effective solutions.
• Documentation & Best Practices: Maintain comprehensive documentation and promote best practices for ETL development and data transformation.
• Troubleshooting & Support: Diagnose and resolve performance issues, failures, and bottlenecks in ETL processes.

Required Qualifications
• Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
• Experience: 6+ years of experience in ETL development, with 3+ years in an ETL architecture role.
• Expertise in Snowflake or any MPP data warehouse (including Snowflake data modeling, optimization, and security best practices).
• Strong experience with AWS Cloud services, especially AWS Glue, AWS Lambda, S3, Redshift, and IAM, or with Azure/GCP cloud services.
• Proficiency in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or DataStage.
• Strong SQL skills and experience with relational and NoSQL databases.
• Experience in API integrations.
• Proficiency in scripting languages (Python, Shell, PowerShell) for automation.
• Prior experience in the pharmaceutical, diagnostics, or healthcare domain is a plus.

Soft Skills
• Strong analytical and problem-solving abilities.
• Excellent communication and documentation skills.
• Ability to work collaboratively in a fast-paced, cloud-first environment.

Preferred Qualifications
• Certifications in AWS, Snowflake, or ETL tools.
• Experience in real-time data streaming, microservices-based architectures, and DevOps for data pipelines.
• Knowledge of data governance, compliance (GDPR, HIPAA), and security best practices.

Who we are
A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.
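For illustration only (not Roche code): a minimal sketch of one step an ETL pipeline like those above might run, loading staged Parquet files into Snowflake with the official snowflake-connector-python package. The account, credentials, stage, and table names are invented.

```python
# Hypothetical load step: COPY files from an external stage into Snowflake.
# All identifiers below are assumptions for the example.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # assumed account identifier
    user="etl_user",
    password="...",         # use a secrets manager in real pipelines
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls files already landed in an external (e.g. S3) stage.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @etl_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())   # per-file load results for logging
finally:
    conn.close()
```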

Posted 2 weeks ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Gurgaon

On-site

About the Opportunity
Job Type: Permanent
Application Deadline: 23 June 2025

Job Description
Title: Expert Engineer
Department: GPS Technology
Location: Gurugram, India
Reports To: Project Manager
Level: Grade 4

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service, and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management, and remediation.

About your role
An Expert Engineer is a seasoned technology expert, highly skilled in programming, engineering, and problem solving, who can deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional, and operational requirements most of the time without defects and incidents. So, if a relentless focus on and drive towards technical and engineering excellence, along with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, come, we are looking for you. You will understand system requirements, then analyse, design, develop, and test application systems following the defined standards. The candidate is expected to display professional ethics in his/her approach to work and exhibit high-level ownership within a demanding working environment.

About you
Essential Skills
• Excellent software designing, programming, engineering, and problem-solving skills.
• Strong experience working on data ingestion, transformation, and distribution using AWS or Snowflake.
• Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion, and dbt.
• Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, and VPCs.
• Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
• Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
• Designing data ingestion and orchestration pipelines using AWS and Control-M.
• Establishing strategies for data extraction, ingestion, transformation, automation, and consumption.
• Experience in data lake concepts with structured, semi-structured, and unstructured data.
• Experience in creating CI/CD processes for Snowflake.
• Experience in strategies for data testing, data quality, code quality, and code coverage.
• Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies.
• Passion for technology, problem solving, and teamwork.
• A go-getter, able to navigate across roles, functions, and business units to collaborate and drive agreements and changes from drawing board to live systems.
• A lifelong learner who can bring contemporary practices, technologies, and ways of working to the organization.
• An effective collaborator, adept at using all effective modes of communication and collaboration tools.
• Experience delivering on data-related non-functional requirements, such as:
  - Hands-on experience dealing with large volumes of historical data across markets/geographies.
  - Manipulating, processing, and extracting value from large, disconnected datasets.
  - Building watertight data quality gates on investment management data.
  - Generic handling of standard business scenarios such as missing data, holidays, and out-of-tolerance errors.

Experience and Qualification
• B.E./B.Tech. or M.C.A. in Computer Science from a reputed university.
• Total 7 to 10 years of relevant experience.

Personal Characteristics
• Good interpersonal and communication skills; strong team player.
• Ability to work at a strategic and tactical level.
• Ability to convey strong messages in a polite but firm manner.
• Self-motivation is essential; should demonstrate commitment to high-quality design and development.
• Ability to develop and maintain working relationships with several stakeholders.
• Flexibility and an open attitude to change.
• Problem-solving skills, with the ability to think laterally and with a medium- and long-term perspective.
• Ability to learn and quickly get familiar with a complex business and technology environment.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working, and how you could build your future here, visit careers.fidelityinternational.com.
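A minimal, hypothetical sketch (not Fidelity code) of the "data quality gate" idea described above: reject a daily positions file if required fields are missing or null rates exceed tolerance, and de-duplicate otherwise. Column names and thresholds are assumptions.

```python
# Hypothetical data quality gate for an investment-data feed.
# Required columns and the tolerance value are invented for the example.
import pandas as pd

REQUIRED = ["portfolio_id", "asset_id", "market_value"]
TOLERANCE = 0.05  # assumed max fraction of null market values

def quality_gate(df: pd.DataFrame) -> pd.DataFrame:
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"missing required columns: {missing}")
    null_frac = df["market_value"].isna().mean()
    if null_frac > TOLERANCE:
        raise ValueError(f"{null_frac:.1%} null market values exceeds tolerance")
    # Drop exact key duplicates rather than failing the whole feed.
    return df.drop_duplicates(subset=["portfolio_id", "asset_id"])

clean = quality_gate(pd.DataFrame({
    "portfolio_id": ["P1", "P1"],
    "asset_id": ["A1", "A2"],
    "market_value": [100.0, 250.5],
}))
print(len(clean), "rows passed the gate")
```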

Posted 2 weeks ago

Apply

4.0 years

8 - 10 Lacs

Bengaluru

On-site

You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong.

Job Description - Big Data Engineer
At American Express, we know that with the right backing, people and businesses have the power to progress in incredible ways. Whether we're supporting our customers' financial confidence to move ahead, taking commerce to new heights, or encouraging people to explore the world, our colleagues are constantly redefining what's possible, and we're proud to back each other every step of the way. When you join #TeamAmex, you become part of a diverse community of over 60,000 colleagues, all with a common goal to deliver an exceptional customer experience every day.

You won't just keep up; you'll break new ground. We are on a journey to create a best-in-class Product & Engineering organization. Our Product people not only understand how technology works, but how that technology intersects with the people who count on it every day. Today, innovative ideas, insight, and new perspectives are at the core of how we create a more powerful, personal, and fulfilling experience for all our customers. So if you're interested in a career creating breakthrough products and making an impact on our sales teams, account development organization, and customers, look no further.

As part of the Enterprise Platforms Sales & Marketing organization, here's just some of what you'll be doing: The Salesforce platform is part of the larger Enterprise Platforms Centre of Excellence, where product, engineering, and delivery colleagues have been brought together to drive a step-function change in speed to market and further unlock customer value across several of the company's core platforms. The global Salesforce platform team sits at the heart of American Express' global marketing, sales, and account development functions for all lines of business. This platform is a critical enabler of American Express' customer centricity and growth objectives.

As a Backend Engineer, you will be responsible for designing, implementing, and maintaining services which connect the Amex Salesforce platform with various enterprise capabilities. Your role will be pivotal in enabling data and integration capabilities, both in real time and in batch mode, to ensure that our Sales and Marketing teams have access to accurate and timely data for decision making.

Job Responsibilities:
• Serve as a core member of an engineering team that designs and develops software applications.
• Effectively interpret technical and business objectives and challenges, and articulate sound solutions.
• Provide technical expertise in driving projects from inception to closure.
• Work independently, and collaborate effectively with cross-functional teams on a case-by-case basis.
• Work with the Product team, other data engineers, and DevOps to deliver features on time.
• Be a change agent with the hands-on ability to build POC products and set best practices.
• Perform code reviews and help the team produce quality code.
• Work in a geographically distributed team setup.
• Identify opportunities for further enhancements and refinements to standards and processes.
• Fine-tune the existing application with new ideas and optimization opportunities to reduce latency.
• Scoping/effort estimates: contribute to solution scoping and effort sizing with a cross-functional team.
• Research: stay informed on technology trends.

Qualifications:
• Bachelor's degree in Engineering or Computer Science or equivalent, or a master's in Computer Applications or equivalent.
• 4+ years of experience in service/framework development using Java/J2EE, Spring Boot, and microservices, along with Big Data technologies like Hadoop, Hive, and Spark.
• 2+ years of experience in service/framework development using Hadoop/Hive/SQL/shell scripting/Spark/Python/Scala/GCP Cloud/Tableau/NiFi preferred.

Technical Skills:
• Demonstrated hands-on experience in application design, software development, and automated testing.
• 4+ years of web services development experience (REST/SOAP/GraphQL).
• 4+ years of experience with Spring; Spring MVC/Spring Boot/Spring Cloud is required.
• 2+ years of Big Data applications.
• 2+ years of experience with Hadoop/Hive/SQL/shell scripting/Spark/Python/Scala/GCP Cloud/Tableau/NiFi (or any ETL tool).
• 4+ years of Agile development experience.
• Expertise in JVM memory management and multithreading concepts.
• Experience with ORM tools like Hibernate/JPA.
• Experience with microservice architecture and distributed (multi-tiered) systems.
• Experience with Kafka and RabbitMQ.
• Expertise in Big Data technologies like MapReduce, Hive, and Spark.
• Exposure to reactive/asynchronous programming.
• Exposure to building cloud-native applications.
• Exposure to Google Cloud Platform.
• Expertise with logging and monitoring tools such as Splunk, Dynatrace, and ELF.
• Proficient in writing MemSQL and Postgres SQL.
• Expertise in object-oriented analysis, design, and design patterns.
• Experience with continuous integration/deployment (Jenkins, XLR, GitHub Actions, etc.).
• Ability to effectively communicate with internal and external business partners.
• Strong written and verbal communication, presentation, problem-solving, and analytical skills.
• Willingness to understand the business and participate in discussions around project requirements.
• Knowledge of cloud platforms like GCP is good to have.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
• Competitive base salaries
• Bonus incentives
• Support for financial well-being and retirement
• Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
• Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
• Generous paid parental leave policies (depending on your location)
• Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
• Free and confidential counseling support through our Healthy Minds program
• Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
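Purely illustrative (not Amex code): a minimal kafka-python producer showing the real-time integration pattern the role describes, publishing an account-update event for downstream enterprise consumers. The broker, topic, and payload are invented.

```python
# Hypothetical real-time integration event published to Kafka.
# Broker address, topic name, and event fields are assumptions.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",                    # assumed broker
    value_serializer=lambda v: json.dumps(v).encode(),  # JSON-encode payloads
)

event = {"account_id": "001XX000003DHPh", "field": "tier", "new_value": "platinum"}
producer.send("salesforce.account-updates", value=event)  # assumed topic
producer.flush()  # block until the broker acknowledges the write
```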

Posted 2 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Summary:
We are seeking a skilled Big Data Tester & Developer to design, develop, and validate data pipelines and applications on large-scale data platforms. You will work on data ingestion, transformation, and testing workflows using tools from the Hadoop ecosystem and modern data engineering stacks.
Experience: 6-12 years

Key Responsibilities:
• Develop and test Big Data pipelines using Spark, Hive, Hadoop, and Kafka
• Write and optimize PySpark/Scala code for data processing
• Design test cases for data validation, quality, and integrity
• Automate testing using Python/Java and tools like Apache NiFi, Airflow, or dbt
• Collaborate with data engineers, analysts, and QA teams

Key Skills:
• Strong hands-on experience in Big Data tools: Spark, Hive, HDFS, Kafka
• Proficient in PySpark, Scala, or Java
• Experience in data testing, ETL validation, and data quality checks
• Familiarity with SQL, NoSQL, and data lakes
• Knowledge of CI/CD, Git, and automation frameworks

We are also looking for a skilled PostgreSQL Developer/DBA to design, implement, optimize, and maintain our PostgreSQL database systems. You will work closely with developers and data teams to ensure high performance, scalability, and data integrity.
Experience: 6-12 years

Key Responsibilities:
• Develop complex SQL queries, stored procedures, and functions
• Optimize query performance and database indexing
• Manage backups, replication, and security
• Monitor and tune database performance
• Support schema design and data migrations

Key Skills:
• Strong hands-on experience with PostgreSQL
• Proficient in SQL and PL/pgSQL scripting
• Experience in performance tuning, query optimization, and indexing
• Familiarity with logical replication, partitioning, and extensions
• Exposure to tools like pgAdmin, psql, or PgBouncer
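An assumed, minimal sketch (not from the posting) of the kind of automated data validation this role describes: a pytest-style PySpark check for key uniqueness and completeness. In a real suite the DataFrame would come from the pipeline's output table rather than inline test data.

```python
# Hypothetical PySpark data-quality tests runnable with pytest.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_no_duplicate_keys_and_no_null_amounts(spark):
    # Inline sample stands in for the pipeline's real output table.
    df = spark.createDataFrame(
        [("o1", 10.0), ("o2", 25.5)], ["order_id", "amount"]
    )
    assert df.count() == df.dropDuplicates(["order_id"]).count()  # key uniqueness
    assert df.filter(df.amount.isNull()).count() == 0             # completeness
```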

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description
Senior Associate / Manager - NiFi Developer
Job Location: Pan India
Candidate should possess 8 to 12 years of experience, of which 3+ years should be relevant.

Roles & Responsibilities:
• Design, develop, and manage data pipelines using Apache NiFi
• Integrate with systems like Kafka, HDFS, Hive, Spark, and RDBMS
• Monitor, troubleshoot, and optimize data flows
• Ensure data quality, reliability, and security
• Work with cross-functional teams to gather requirements and deliver data solutions

Skills Required:
• Strong hands-on experience with Apache NiFi
• Knowledge of data ingestion, streaming, and batch processing
• Experience with Linux, shell scripting, and cloud environments (AWS/GCP is a plus)
• Familiarity with REST APIs, JSON/XML, and data transformation

Role: NiFi Developer - SA/M
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduate
Employment Type: Full Time, Permanent
Key Skills: NiFi Developer, Design, Development

Other Information
Job Code: GO/JC/21435/2025
Recruiter Name: SPriya
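A hedged sketch (not part of the posting) of the flow-monitoring task this role mentions: checking overall NiFi health from Python through the NiFi REST API. The base URL is an assumption, and secured clusters would additionally need authentication; field names follow the documented `/flow/status` response shape.

```python
# Hypothetical NiFi health check via the REST API; endpoint URL is assumed.
import requests

NIFI = "http://nifi-host:8080/nifi-api"  # assumed endpoint

status = requests.get(f"{NIFI}/flow/status", timeout=10).json()
cs = status["controllerStatus"]
print("active threads:", cs["activeThreadCount"])
print("queued:", cs["queued"])  # e.g. "1,234 / 56.7 MB" awaiting processing

# A persistently growing queue is a common first symptom of a stuck or
# under-provisioned processor downstream.
```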

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

NiFi + SQL - JD
• Resource should have 4+ years of relevant experience in NiFi development projects
• Good in SQL
• Good in Unix
• Good in AWS, or willing to learn AWS
• Strong in writing complex queries
• Good communication and attitude
• Willing to work on a support/development project
• 3 days in office
• No long leave plans after joining the project

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
Senior Associate / Manager - NiFi Developer
Job Location: Pan India
Candidate should possess 8 to 12 years of experience, of which 3+ years should be relevant.

Roles & Responsibilities:
• Design, develop, and manage data pipelines using Apache NiFi
• Integrate with systems like Kafka, HDFS, Hive, Spark, and RDBMS
• Monitor, troubleshoot, and optimize data flows
• Ensure data quality, reliability, and security
• Work with cross-functional teams to gather requirements and deliver data solutions

Skills Required:
• Strong hands-on experience with Apache NiFi
• Knowledge of data ingestion, streaming, and batch processing
• Experience with Linux, shell scripting, and cloud environments (AWS/GCP is a plus)
• Familiarity with REST APIs, JSON/XML, and data transformation

Role: NiFi Developer - SA/M
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduate
Employment Type: Full Time, Permanent
Key Skills: NiFi Developer, Design, Development

Other Information
Job Code: GO/JC/21435/2025
Recruiter Name: SPriya

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site

Source: LinkedIn

You Lead the Way. We've Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities, and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard, and feels like they belong.

Job Description - Big Data Engineer
At American Express, we know that with the right backing, people and businesses have the power to progress in incredible ways. Whether we're supporting our customers' financial confidence to move ahead, taking commerce to new heights, or encouraging people to explore the world, our colleagues are constantly redefining what's possible, and we're proud to back each other every step of the way. When you join #TeamAmex, you become part of a diverse community of over 60,000 colleagues, all with a common goal to deliver an exceptional customer experience every day.

You won't just keep up; you'll break new ground. We are on a journey to create a best-in-class Product & Engineering organization. Our Product people not only understand how technology works, but how that technology intersects with the people who count on it every day. Today, innovative ideas, insight, and new perspectives are at the core of how we create a more powerful, personal, and fulfilling experience for all our customers. So if you're interested in a career creating breakthrough products and making an impact on our sales teams, account development organization, and customers, look no further.

As part of the Enterprise Platforms Sales & Marketing organization, here's just some of what you'll be doing: The Salesforce platform is part of the larger Enterprise Platforms Centre of Excellence, where product, engineering, and delivery colleagues have been brought together to drive a step-function change in speed to market and further unlock customer value across several of the company's core platforms. The global Salesforce platform team sits at the heart of American Express' global marketing, sales, and account development functions for all lines of business. This platform is a critical enabler of American Express' customer centricity and growth objectives.

As a Backend Engineer, you will be responsible for designing, implementing, and maintaining services which connect the Amex Salesforce platform with various enterprise capabilities. Your role will be pivotal in enabling data and integration capabilities, both in real time and in batch mode, to ensure that our Sales and Marketing teams have access to accurate and timely data for decision making.

Job Responsibilities:
• Serve as a core member of an engineering team that designs and develops software applications.
• Effectively interpret technical and business objectives and challenges, and articulate sound solutions.
• Provide technical expertise in driving projects from inception to closure.
• Work independently, and collaborate effectively with cross-functional teams on a case-by-case basis.
• Work with the Product team, other data engineers, and DevOps to deliver features on time.
• Be a change agent with the hands-on ability to build POC products and set best practices.
• Perform code reviews and help the team produce quality code.
• Work in a geographically distributed team setup.
• Identify opportunities for further enhancements and refinements to standards and processes.
• Fine-tune the existing application with new ideas and optimization opportunities to reduce latency.
• Scoping/effort estimates: contribute to solution scoping and effort sizing with a cross-functional team.
• Research: stay informed on technology trends.

Qualifications:
• Bachelor's degree in Engineering or Computer Science or equivalent, or a master's in Computer Applications or equivalent.
• 4+ years of experience in service/framework development using Java/J2EE, Spring Boot, and microservices, along with Big Data technologies like Hadoop, Hive, and Spark.
• 2+ years of experience in service/framework development using Hadoop/Hive/SQL/shell scripting/Spark/Python/Scala/GCP Cloud/Tableau/NiFi preferred.

Technical Skills:
• Demonstrated hands-on experience in application design, software development, and automated testing.
• 4+ years of web services development experience (REST/SOAP/GraphQL).
• 4+ years of experience with Spring; Spring MVC/Spring Boot/Spring Cloud is required.
• 2+ years of Big Data applications.
• 2+ years of experience with Hadoop/Hive/SQL/shell scripting/Spark/Python/Scala/GCP Cloud/Tableau/NiFi (or any ETL tool).
• 4+ years of Agile development experience.
• Expertise in JVM memory management and multithreading concepts.
• Experience with ORM tools like Hibernate/JPA.
• Experience with microservice architecture and distributed (multi-tiered) systems.
• Experience with Kafka and RabbitMQ.
• Expertise in Big Data technologies like MapReduce, Hive, and Spark.
• Exposure to reactive/asynchronous programming.
• Exposure to building cloud-native applications.
• Exposure to Google Cloud Platform.
• Expertise with logging and monitoring tools such as Splunk, Dynatrace, and ELF.
• Proficient in writing MemSQL and Postgres SQL.
• Expertise in object-oriented analysis, design, and design patterns.
• Experience with continuous integration/deployment (Jenkins, XLR, GitHub Actions, etc.).
• Ability to effectively communicate with internal and external business partners.
• Strong written and verbal communication, presentation, problem-solving, and analytical skills.
• Willingness to understand the business and participate in discussions around project requirements.
• Knowledge of cloud platforms like GCP is good to have.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
• Competitive base salaries
• Bonus incentives
• Support for financial well-being and retirement
• Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
• Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
• Generous paid parental leave policies (depending on your location)
• Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
• Free and confidential counseling support through our Healthy Minds program
• Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Location – Gurgaon
Experience – 5-10 years
Role – Big Data Architect

About the Role: As a Big Data Engineer, you will play a critical role in integrating multiple data sources, designing scalable data workflows, and collaborating with data architects, scientists, and analysts to develop innovative solutions. You will work with rapidly evolving technologies to achieve strategic business goals.

Must-Have Skills:
4+ years of mandatory experience with Big Data
4+ years of mandatory experience in Apache Spark
Proficiency in Apache Spark, Hive on Tez, and Hadoop ecosystem components
Strong coding skills in Python and PySpark
Experience building reusable components or frameworks using Spark (see the sketch after this listing)
Expertise in data ingestion from multiple sources using APIs, HDFS, and NiFi
Solid experience working with structured, unstructured, and semi-structured data formats (Text, JSON, Avro, Parquet, ORC, etc.)
Experience with UNIX Bash scripting and databases like Postgres, MySQL, and Oracle
Ability to design, develop, and evolve fault-tolerant distributed systems
Strong SQL skills, with expertise in Hive, Impala, MongoDB, and NoSQL databases
Hands-on with Git and CI/CD tools
Experience with streaming data technologies (Kafka, Spark Streaming, Apache Flink, etc.)
Proficient with HDFS or similar data lake technologies
Excellent problem-solving skills - you will be evaluated through coding rounds

Key Responsibilities:
Handle existing or new Apache HDFS clusters, including name node, data node, and edge node commissioning and decommissioning
Work closely with data architects and analysts to design technical solutions
Integrate and ingest data from multiple source systems into big data environments
Develop end-to-end data transformations and workflows, ensuring logging and recovery mechanisms
Troubleshoot Spark job failures
Design and implement batch, real-time, and near-real-time data pipelines
Optimize Big Data transformations using Apache Spark, Hive, and Tez
Work with Data Science teams to enhance actionable insights
Ensure seamless data integration and transformation across multiple systems
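A sketch of the "reusable components using Spark" idea from the skills list above: one generic reader that covers the JSON/Avro/Parquet/ORC formats the posting mentions. The paths and options are assumptions for illustration.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("generic-ingest").getOrCreate()

def ingest(path, fmt, options=None):
    """Reusable reader for Text/JSON/Avro/Parquet/ORC sources."""
    reader = spark.read.format(fmt)
    for key, value in (options or {}).items():
        reader = reader.option(key, value)
    return reader.load(path)

# Usage with placeholder paths; Avro additionally needs the spark-avro package.
orders = ingest("s3a://landing/orders/", "parquet")
events = ingest("s3a://landing/events/", "json", {"multiLine": "true"})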

Posted 2 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

About The Opportunity
Job Type: Permanent
Application Deadline: 23 June 2025

Job Description
Title: Expert Engineer
Department: GPS Technology
Location: Gurugram, India
Reports To: Project Manager
Level: Grade 4

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About Your Team
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service, and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management, and remediation.

About Your Role
An Expert Engineer is a seasoned technology expert with highly developed programming, engineering, and problem-solving skills, able to deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional, and operational requirements most of the time without defects and incidents. So, if a relentless focus on technical and engineering excellence, along with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, come - we are looking for you. You will understand system requirements, then analyse, design, develop, and test application systems following the defined standards. The candidate is expected to display professional ethics in their approach to work and exhibit a high level of ownership within a demanding working environment.

About You
Essential Skills:
Excellent software design, programming, engineering, and problem-solving skills
Strong experience working on data ingestion, transformation, and distribution using AWS or Snowflake
Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion / DBT (a Snowflake ingestion sketch follows this listing)
Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, and VPCs
Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality)
Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic
Designing data ingestion and orchestration pipelines using AWS and Control-M
Establishing strategies for data extraction, ingestion, transformation, automation, and consumption
Experience in data lake concepts with structured, semi-structured, and unstructured data
Experience in creating CI/CD processes for Snowflake
Experience in strategies for data testing, data quality, code quality, and code coverage
Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies
Passion for technology, problem solving, and teamwork
Go-getter; ability to navigate across roles, functions, and business units to collaborate and drive agreements and changes from drawing board to live systems
Lifelong learner who can bring contemporary practices, technologies, and ways of working to the organisation
Effective collaborator adept at using all effective modes of communication and collaboration tools

Experience delivering on data-related non-functional requirements, such as:
Hands-on experience dealing with large volumes of historical data across markets/geographies
Manipulating, processing, and extracting value from large, disconnected datasets
Building water-tight data quality gates on investment management data
Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.

Experience and Qualification:
B.E./B.Tech. or M.C.A. in Computer Science from a reputed university
Total 7 to 10 years of relevant experience

Personal Characteristics:
Good interpersonal and communication skills; strong team player
Ability to work at a strategic and tactical level
Ability to convey strong messages in a polite but firm manner
Self-motivation is essential; should demonstrate commitment to high-quality design and development
Ability to develop and maintain working relationships with several stakeholders
Flexibility and an open attitude to change
Problem-solving skills, with the ability to think laterally and with a medium-term and long-term perspective
Ability to learn and quickly get familiar with a complex business and technology environment

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work - finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
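To make the SnowSQL/Snowpipe bullet above concrete, a minimal sketch using the snowflake-connector-python package: a stage-to-table COPY INTO, which is the same statement Snowpipe automates on file arrival. The account, stage, and table names are placeholders.

import snowflake.connector

# Placeholder credentials - in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="RAW", schema="LANDING",
)

# Stage-to-table load; Snowpipe runs this COPY INTO automatically per file.
conn.cursor().execute("""
    COPY INTO raw.landing.trades
    FROM @raw.landing.trades_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
conn.close()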

Posted 2 weeks ago

Apply

0.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

About the Role:
Grade Level (for internal use): 09

The Team: We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy, promoting innovation. One should feel empowered to iterate over ideas and experiment without fear of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities: We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of web applications, data pipelines, big data, machine learning, and multi-cloud. The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing.
Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests
Past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible, and Prometheus or related cloud technologies
Good understanding of single, hybrid, and multicloud architecture, preferably with hands-on experience
Active participation in all scrum ceremonies, following AGILE best practices effectively
Play a key role in the development team to build high-quality, high-performance, scalable code
Produce technical design documents and conduct technical walkthroughs
Document and demonstrate solutions using technical design docs, diagrams, and stubbed code
Collaborate effectively with technical and non-technical stakeholders
Respond to and resolve production issues

What we are looking for:
Minimum of 5-8 years of significant experience in application development
Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development
Experience working with high-volume data and computationally intensive systems
Garbage-collection-friendly programming experience - tuning Java garbage collection and performance is a must
Proficiency in the development environment, including IDE, web & application server, Git, continuous integration, unit-testing tools, and defect management tools
Domain knowledge in the financial industry and capital markets is a plus
Excellent communication skills are essential, with strong verbal and writing proficiencies
Mentor teams, innovate and experiment, give face to business ideas, and present to key stakeholders

Required technical skills:
Build data pipelines
Utilize platforms like Snowflake, Talend, Databricks, etc.
Utilize cloud managed services like AWS Step Functions, AWS Lambda, AWS DynamoDB
Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow (a minimal Airflow sketch follows this listing)
Develop federated data services to provide scalable and performant data APIs: REST, GraphQL, OData
Write infrastructure as code to develop sandbox environments
Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
Feed data at scale to clients that are geographically distributed
Experience building sophisticated and highly automated infrastructure
Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc.
Demonstrates ability to adapt to new technologies and learn quickly

Desirable technical skills: Java, Spring Boot, React, HTML/CSS, API development, microservices pattern, cloud technologies and managed services (preferably AWS), Big Data and analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
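As a concrete companion to the "Apache NiFi, Airflow, Spark" line in the skills above, a minimal Airflow 2.x DAG with two dependent tasks. The DAG id and task bodies are illustrative only.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # placeholder: pull data from a source system

def transform():
    pass  # placeholder: reshape and publish the data

with DAG(
    dag_id="marketplace_pipeline",     # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform only after extract succeeds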

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002011
Marketing / Loyalty / Mileage Plus / Alliances
Job Type: Full-Time
Posted Date: 06/02/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
United's Kinective Media Data Engineering team designs, develops, and maintains massively scaling ad-technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space-available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.

Job overview and responsibilities
The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial projects with a digital focus. The Data Engineer will partner with various teams to define and execute data acquisition, transformation, and processing, and make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. They will execute unit tests and validate expected results to ensure the accuracy and integrity of data and applications through analysis, coding, clear documentation, and problem resolution. This role will also drive the adoption of data processing and analysis within the AWS environment and help cross-train other members of the team.
Leverage strategic and analytical skills to understand and solve customer- and business-centric questions
Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business
Develop and implement innovative solutions leading to automation
Use Agile methodologies to manage projects
Mentor and train junior engineers

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
Required:
BS/BA in computer science or a related STEM field
2+ years of IT experience in software development
2+ years of development experience using Java, Python, or Scala
2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBase, Kafka, and NiFi (a streaming-ingest sketch follows this listing)
2+ years of experience with database systems like Redshift, MS SQL Server, Oracle, and Teradata
Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights
Individuals who have a natural curiosity and desire to solve problems are encouraged to apply
Must be legally authorized to work in India for any employer without sponsorship
Must be fluent in English (written and spoken)
Successful completion of interview required to meet job qualification
Reliable, punctual attendance is an essential function of the position

Preferred:
Master's in computer science or a related STEM field
Experience with cloud-based systems like AWS, Azure, or Google Cloud
Certified Developer / Architect on AWS
Strong experience with continuous integration and delivery using Agile methodologies
Data engineering experience in the transportation/airline industry
Strong problem-solving skills
Strong knowledge of Big Data
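A minimal sketch of the PySpark-plus-Kafka combination in the required skills: a Structured Streaming job landing Kafka events as Parquet. The broker, topic, and paths are assumptions, and the job needs the spark-sql-kafka connector package on its classpath.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Placeholder broker and topic names.
raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "booking-events")
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers bytes; cast key and value to strings before writing.
events = raw.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://lake/bronze/booking_events/")
         .option("checkpointLocation", "s3a://lake/_chk/booking_events/")
         .start())
query.awaitTermination()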

Posted 2 weeks ago

Apply


0 years

9 - 10 Lacs

Pune

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Lead Product Manager - Technical

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview:
We are looking for a Lead Product Manager - Technical to join the PVS Identity Solutions team and drive our strategy forward by consistently innovating and problem-solving. The ideal candidate is passionate about technology, highly motivated, intellectually curious, analytical, and possesses an entrepreneurial mindset.

Role
Provide technical leadership for new major initiatives
Deliver innovative, cost-effective solutions which align to enterprise standards
Be an integrated part of an Agile engineering team, working interactively with software engineer leads, architects, and testing engineers from the beginning of the development cycle
Help ensure functionality delivered in each release is fully tested end to end
Manage multiple priorities and tasks in a dynamic work environment
Identify issues that will keep the platform features from delivering on time and/or with the desired requirements, and communicate them to leadership
Work with internal teams and customer service to identify, classify, and prioritize feature-level customer issues
Coordinate internal forums to collect and identify feature-level development opportunities
Own and manage product documentation; enable self-service support and/or work to reduce overhead
Identify feature risks from business and customer feedback and in-depth analysis of operational performance; share with senior leadership
Digest business customer requirements (user stories, use cases) and platform requirements for a platform feature set
Determine release goals for the platform and prioritize assigned features according to business and platform value, adjusting throughout implementation as needed
Review product demos with the development team against acceptance criteria for the feature set

All About You
Bachelor's degree in computer science or equivalent work experience with hands-on technical and quality engineering skills
Excellent technical acumen, strong organizational and problem-solving skills with great attention to detail, critical thinking, solid communication, and proven leadership skills
Solid leadership and mentoring skills with the ability to drive change
Knowledge of Java, SQL, APIs (REST/SOAP), code reviews, scanning tools and configuration, and branching techniques
Experience with application monitoring tools such as Dynatrace and Splunk
Experience with chaos, software security, and crypto testing practices
Experience with DevOps practices (continuous integration and delivery, and tools such as Jenkins)
Nice to have: knowledge of or prior experience with orchestration using Apache NiFi or Apache Airflow, and an understanding of microservices architecture
Take the time to fully learn the functionality, architecture, dependencies, and runtime properties of the systems supporting your platform products. This includes the business requirements and associated use cases, Mastercard customers' experience, Mastercard's back office systems, the technical stack (application/service architecture), interfaces and associated data flows, dependent applications/services, runtime operations (i.e. trouble management/associated support strategies), and maintenance.
Understands and can explain the business context and the associated customer use cases
Proficient at grooming user stories, setting entrance/exit criteria, and prioritizing a platform product backlog
Understands the technologies supporting the platform product and is able to hold their own in debates with other PM-Ts, TPMs, SDEs, and SPMs
Recognize discordant views and take part in constructive dialog to resolve them
Verbal and written communication is clear and concise
Improve team processes that accelerate delivery, drive innovation, lower costs, and improve quality

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach, and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Karnataka, India

On-site

Linkedin logo

Who You'll Work With
You will be part of the Digital Design & Merchandising, Product Creation, Planning, and Manufacturing Technology team at Converse. You will take direction from and work primarily with the Demand and Supply team, supporting the business planning space. You'll work with a talented team of engineers, data architects, and business stakeholders to design and implement scalable data integration solutions on cloud-based platforms to support our planning org. The successful candidate will be responsible for leading the integration of planning systems, processes, and data across the organization.

Who We Are Looking For
We're looking for a seasoned Cloud Integration Lead with expertise in Databricks, Apache Spark, and cloud-based data integration. You'll have a strong technical background, excellent collaboration skills, and a passion for delivering high-quality solutions.

The Ideal Candidate Will Have
5+ years of experience with Databricks, Apache Spark, and cloud-based data integration
Strong technical expertise with cloud-based platforms, including AWS and/or Azure
Strong programming skills in languages like SQL, Python, Java, or Scala
3+ years' experience with cloud-based data infrastructure and integration leveraging tools like S3, Airflow, EC2, AWS Glue, DynamoDB & Lambdas, Athena, AWS CodeDeploy, Azure Data Factory, or Google Cloud Dataflow
Experience with Jenkins and other CI/CD tools like GitLab CI/CD, CircleCI, etc.
Experience with containerization using Docker and Kubernetes
Experience with infrastructure as code using tools like Terraform or CloudFormation
Experience with Agile development methodologies and version control systems like Git
Experience with IT service management tools like ServiceNow, JIRA, etc.
Data warehousing solutions, such as Amazon Redshift, Azure Synapse Analytics, or Google BigQuery, will be a plus but not mandatory
Data science and machine learning concepts, including TensorFlow, PyTorch, or scikit-learn, will be a plus but not mandatory
Strong technical background in computer science, software engineering, or a related field
Excellent collaboration, communication, and interpersonal skills
Experience with data governance, data quality, and data security principles
Ability to lead and mentor junior team members
AWS Certified Solutions Architect, AWS Certified Developer Associate, or Azure Certified Solutions Architect certification

What You'll Work On
Design and implement scalable data integration solutions using Databricks, Apache Spark, and cloud-based platforms (a minimal Spark example follows this listing)
Develop and implement cloud-based data pipelines using Databricks, NiFi, AWS Glue, Azure Data Factory, or Google Cloud Dataflow
Collaborate with cross-functional teams to deliver high-quality solutions that meet business requirements
Develop and maintain technical standards, best practices, and documentation
Integrate various data sources, including on-premises and cloud-based systems, applications, and databases
Ensure data quality, integrity, and security throughout the integration process
Collaborate with data engineering, data science, and business stakeholders to understand requirements and deliver solutions
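A minimal Databricks-style sketch of the Spark integration work described above: promoting a Delta table from a raw (bronze) layer to a cleaned (silver) layer. The paths and columns are hypothetical, and a Delta-enabled Spark session is assumed.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided on Databricks

bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Deduplicate and stamp each row before promotion to silver.
silver = (bronze
          .dropDuplicates(["order_id"])
          .withColumn("ingested_at", F.current_timestamp()))

(silver.write.format("delta")
       .mode("overwrite")
       .save("/mnt/lake/silver/orders"))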

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Karnataka, India

On-site

Linkedin logo

Who You'll Work With
NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries, and push out the edges of what can be. The company looks for people who can grow, think, dream, and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders, and visionaries. At Nike, it's about each person bringing skills and passion to a challenging and constantly evolving game. This role is part of Enterprise Architecture and Developer Platforms (EADP) and works on the Managed Airflow Platform (MAP) within the Data Integration team.

Who We Are Looking For
We are looking for a Software Engineer II, MAP, EADP who excels in team environments and is excited about building cloud-native platforms that can scale with the demands of our business. This team aggressively innovates solutions to drive growth while creating and implementing tools that help make everything else in the company possible. The candidate needs to have a strong understanding of technical concepts; excellent attention to detail, data accuracy, and data analysis; strong verbal and written communication skills; and must be self-motivated, operating with a high sense of urgency and a high level of integrity.

Key Skills & Traits
Master's or Bachelor's degree in Computer Science or a related field
4+ years of experience in large-scale production-grade software development and platform engineering
4+ years of hands-on experience with AWS, Azure, or GCP
2+ years of experience developing data & analytics solutions using Airflow, Spark, NiFi, Kafka
4+ years of strong development experience in languages like Python and Java, or Scala and Node.js
2+ years of experience with the ELK stack and observability & monitoring tools like SignalFx, New Relic, etc.
2+ years of experience with the DevOps stack: GitHub, Jenkins, Docker, Terraform, EKS, etc.
Experience with both relational and NoSQL databases and data lakes
Experience with developing and securing RESTful APIs and apps using OAuth 2.0, OpenID Connect, and JWT a plus (a token-flow sketch follows this listing)
Exposure to Agile and test-driven development a plus
Experience delivering projects in a highly collaborative, multi-discipline development team environment

What You'll Work On
As a Software Engineer II, you will play a crucial role in shaping, modernizing, and scaling Nike's cloud data orchestration platform while also driving innovation and automation within our data ecosystem.

Platform Responsibilities
Evangelize and cultivate adoption of global platforms, open-source software, and agile principles within the organization
Ensure solutions are designed and developed using a scalable, highly resilient cloud-native architecture
Deliver well-documented and well-tested code, and participate in peer code reviews
Design and develop tools and frameworks to improve security, reliability, maintainability, availability, and performance for the technology foundation of our platform
Ensure product and technical features are delivered to spec and on time
Collaborate with and consult other Nike development teams, architecture teams, etc.
Explain designs and constraints to stakeholders and technical teams; gather alignment and buy-in
Provide responsive support and operations for the platforms you help build
Work with product management to support product/service scoping activities
Work with leadership to define delivery schedules of key features through an agile framework
Be a key contributor to the overall architecture, framework, and design of global platforms
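For the OAuth 2.0 bullet above, a sketch of the client-credentials flow a platform service might use: exchange client credentials for a bearer token, then call a protected API. Both URLs, the scope, and the credentials are placeholders.

import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"  # placeholder issuer

# Exchange client credentials for an access token.
resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "platform.read"},
    auth=("client_id", "client_secret"),  # placeholder credentials
    timeout=10,
)
resp.raise_for_status()
token = resp.json()["access_token"]

# Call a protected REST endpoint with the bearer token.
api = requests.get(
    "https://api.example.com/v1/pipelines",  # placeholder endpoint
    headers={"Authorization": "Bearer " + token},
    timeout=10,
)
print(api.status_code)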

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Title: Infrastructure Lead/Architect
Job Type: Full-Time
Location: On-site - Hyderabad, Pune or New Delhi

Job Summary
Join our customer's team as an Infrastructure Lead/Architect and play a pivotal role in architecting, designing, and implementing next-generation cloud infrastructure solutions. You will drive cloud and data platform initiatives, ensure system scalability and security, and act as a technical leader, shaping the backbone of our customers' mission-critical applications.

Key Responsibilities
Architect, design, and implement robust, scalable, and secure AWS cloud infrastructure utilizing services such as EC2, S3, Lambda, RDS, Redshift, and IAM
Lead the end-to-end design and deployment of high-performance, cost-efficient Databricks data pipelines, ensuring seamless integration with business objectives
Develop and manage data integration workflows using modern ETL tools in combination with Python and Java scripting (a minimal orchestration sketch follows this listing)
Collaborate with Data Engineering, DevOps, and Security teams to build resilient, highly available, and compliant systems aligned with operational standards
Act as a technical leader and mentor, guiding cross-functional teams through infrastructure design decisions and conducting in-depth code and architecture reviews
Oversee project planning, resource allocation, and deliverables, ensuring projects are executed on time and within budget
Proactively identify infrastructure bottlenecks, recommend process improvements, and drive automation initiatives
Maintain comprehensive documentation and uphold security and compliance standards across the infrastructure landscape

Required Skills and Qualifications
8+ years of hands-on experience in IT infrastructure, cloud architecture, or related roles
Extensive expertise with AWS cloud services; AWS certifications are highly regarded
Deep experience with Databricks, including cluster deployment, Delta Lake, and machine learning integrations
Strong programming and scripting proficiency in Python and Java
Advanced knowledge of ETL/ELT processes and tools such as Apache NiFi, Talend, Airflow, or Informatica
Proven track record in project management, leading cross-functional teams; PMP or Agile/Scrum certifications are a plus
Familiarity with CI/CD workflows and Infrastructure as Code tools like Terraform and CloudFormation
Exceptional problem-solving, stakeholder management, and both written and verbal communication skills

Preferred Qualifications
Experience with big data platforms such as Spark or Hadoop
Background in regulated environments (e.g., finance, healthcare)
Knowledge of Kubernetes and AWS container orchestration (EKS)
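One small example of the Python-scripted orchestration this role describes: triggering an AWS Glue job with boto3 and polling it to completion. The job name and region are assumptions, and configured AWS credentials are assumed.

import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")  # placeholder region

run = glue.start_job_run(JobName="nightly-etl")       # placeholder job name
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state, as a deploy script might.
while True:
    job_run = glue.get_job_run(JobName="nightly-etl", RunId=run_id)
    state = job_run["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
print("Glue run finished with state:", state)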

Posted 2 weeks ago

Apply


9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About Markovate
At Markovate, we don't just follow trends, we drive them. We transform businesses through innovative AI and digital solutions that turn vision into reality. Our team harnesses breakthrough technologies to craft bespoke strategies that align seamlessly with our clients' ambitions. From AI consulting and Gen AI development to pioneering AI agents and agentic AI, we empower our partners to lead their industries with forward-thinking precision.

Overview
We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modelling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault 2.0.

Requirements:
9+ years of experience in data engineering and data architecture
Excellent communication and interpersonal skills, with the ability to engage with teams
Strong problem-solving, decision-making, and conflict-resolution abilities
Proven ability to work independently and lead cross-functional teams
Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism
Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion
Strong work ethics and trustworthiness
Highly collaborative and team oriented, with a strong sense of commitment

Responsibilities:
Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF)
Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling (a minimal validation sketch follows this listing)
Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer)
Develop and maintain bronze → silver → gold data layers using DBT or Coalesce
Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery
Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata
Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams)
Work closely with QA teams to integrate test automation and ensure data quality
Collaborate with cross-functional teams including data scientists and business stakeholders to align solutions with AI/ML use cases
Document architectures, pipelines, and workflows for internal stakeholders

Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid)
Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python
Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts
Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and dimensional modelling
Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF Triggers
Expertise in monitoring and logging with CloudWatch, AWS Glue metrics, MS Teams alerts, and Azure Data Explorer (ADX)
Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection
Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates
Experienced in data validation and exploratory data analysis with pandas profiling and AWS Glue Data Quality

Great to have:
Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services
Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica)
Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs)
Experience with data modeling, data structures, and database design
Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake)
Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka)
Proficiency in SQL and at least one programming language (e.g., Python)

What it's like to be at Markovate:
At Markovate, we thrive on collaboration and embrace every innovative idea. We invest in continuous learning to keep our team ahead in the AI/ML landscape. Transparent communication is key: every voice at Markovate is valued. Our agile, data-driven approach transforms challenges into opportunities. We offer flexible work arrangements that empower creativity and balance. Recognition is part of our DNA: your achievements drive our success. Markovate is committed to sustainable practices and positive community impact. Our people-first culture means your growth and well-being are central to our mission.

Location: hybrid model, 2 days onsite.

(ref:hirist.tech)
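A minimal sketch of the schema-validation-and-profiling responsibility above, written as a pandas quality gate. The expected schema and input file name are invented for illustration; a real gate would load both from configuration.

import pandas as pd

# Hypothetical expected schema for an incoming file.
EXPECTED = {"order_id": "int64", "amount": "float64", "country": "object"}

def validate(df):
    problems = []
    for col, dtype in EXPECTED.items():
        if col not in df.columns:
            problems.append("missing column: " + col)
        elif str(df[col].dtype) != dtype:
            problems.append("%s: expected %s, got %s" % (col, dtype, df[col].dtype))
    if not problems and df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    return problems

df = pd.read_csv("orders.csv")  # placeholder input file
issues = validate(df)
if issues:
    raise ValueError("; ".join(issues))  # fail the pipeline and alert downstream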

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

hackajob is collaborating with OneAdvanced to connect them with exceptional tech professionals for this role.

Senior Data Integration Engineer
IN-KA-Bengaluru

Role Introduction
We are seeking a Data Integration Specialist who will be responsible for ensuring seamless data flow between various systems, using several integration toolsets, managing data integration processes, and maintaining data quality and accessibility. You will work closely with our Data Analyst and report to the Data Eco-System Leader. This is a new role in a developing team. You will be in on the ground floor, so you will help to shape how we mature in this space.

What You Will Do
Key Responsibilities:
Design and Develop Data Integration Solutions: Create and implement data integration processes using ETL (Extract, Transform, Load) tools to consolidate data from various sources into cohesive data models
Build Integration Scripts and Flows: As defined by the business stakeholders, build and/or change integrations already developed within the business (a minimal REST-integration sketch follows this listing)
Data Quality Management: Conduct data quality assessments and implement measures to enhance data accuracy and integrity; operationalise the data exceptions and reporting around the integration of data
Collaboration: Work closely within and across the functional teams to gather requirements and understand diverse data sources, ensuring that integration strategies align with business objectives
Monitoring and Troubleshooting: Oversee data integration workflows, resolve issues, and optimize performance to ensure reliable data flow
Documentation: Maintain comprehensive documentation of data integration processes, data flows, and system configurations
Stay Updated: Keep abreast of industry trends and best practices in data integration and management

What You Will Have
Technical Skills & Qualifications:
Delivery focus: you will have led a team or been the deputy manager for a delivery-focused team, preferably cross-discipline
Technical Expertise: Extensive knowledge of data integration tools and languages, such as Dell Boomi, REST APIs, Microsoft Fabric, Integration Hub, SQL, ETL, and XML
Problem-Solving Skills: Strong analytical skills to interpret complex data and troubleshoot integration issues effectively
Communication Skills: Effective communication skills to liaise with multiple technical and business teams and explain complex data issues clearly
Experience: Proven experience as a Data Integration Specialist or a similar role, with hands-on experience using ETL tools like Talend, Informatica, or Apache NiFi
Education: A bachelor's degree in a related field such as Computer Science, Information Technology, or Engineering is typically required, or proven experience in data mining, ETL, and data analysis

Would be really good to have:
Tools: Experience with Boomi, REST APIs, ServiceNow Integration Hub, JIRA, and ITSM platforms is beneficial
Scripting: Understanding and ability to design and script workflows and automations
Enterprise Systems: An understanding of data structures in Salesforce

What We Do For You
Wellbeing focused - our people are our greatest assets, and ensuring everyone feels their best self to come to work is integral
Annual Leave - 20 days of annual leave, plus public holidays
Employee Assistance Programme - free advice, support, and confidential counselling available 24/7
Personal Growth - we're committed to enabling your growth personally and professionally through development programmes
Life Insurance - 2x annual salary
Personal Accident Insurance - providing cover in the event of serious injury/illness
Performance Bonus - our group-wide bonus scheme enables you to reap the rewards of your success

Who We Are
OneAdvanced is one of the UK's largest providers of business software and services, serving 20,000+ global customers with an annual turnover of £330M+. We manage 1.5 million 111 calls per month, support over 2 million Further Education learners across the UK, handle over 10 million wills, and so much more. Our mission is to power the world of work and, as you can see, our software underpins some of the UK's most critical sectors. We invest in our brilliant people. They are at the heart of our success as we strive to be a diverse, inclusive and engaging place to work that not only powers the world of work, but empowers the growth, ambitions and talent of our people.
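As a sketch of the REST-based integration work described above: pulling records from an API with retries and exponential backoff using the requests library. The endpoint and token are placeholders, and the sketch assumes the endpoint returns a JSON array.

import time
import requests

URL = "https://api.example.com/v1/records"  # placeholder endpoint

def fetch_with_retry(url, attempts=3, backoff=2.0):
    for n in range(attempts):
        try:
            resp = requests.get(
                url,
                headers={"Authorization": "Bearer <token>"},  # placeholder
                timeout=15,
            )
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if n == attempts - 1:
                raise                   # give up after the last attempt
            time.sleep(backoff ** n)    # exponential backoff between tries

records = fetch_with_retry(URL)
print("fetched", len(records), "records")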

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies