Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter.

Sabre is seeking a talented Senior Data Science Engineer for the SabreMosaic team. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational and NoSQL databases such as Oracle, Spanner, and BigQuery.
- Expert-level SQL skills for data manipulation and validation.
- Experience in designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience in designing Star and Snowflake schemas and knowledge of dimensional data modeling.
- Collaboration with data scientists, data teams, and engineering teams using the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding and experience in Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker.
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices such as CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Professional Cloud Data Engineer certification is a plus.

We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.
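The star-schema and dimensional-modeling skills listed above can be illustrated with a minimal sketch. This uses SQLite purely for a self-contained demo (the posting targets BigQuery); all table and column names here are invented for illustration.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 50.0), (20240201, 1, 75.0);
""")

# Typical star-schema query: aggregate the fact table, sliced by a
# dimension attribute reached through the surrogate key.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 50.0), ('widget', 175.0)]
```

A snowflake schema would further normalize the dimensions (e.g. splitting product category into its own table); the fact table and join pattern stay the same.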
Posted 4 days ago
5.0 - 10.0 years
10 - 16 Lacs
Bengaluru
Work from Office
Description:
- 5+ years of experience in Java, Spring Boot, microservices, ReactJS, product development, and sustenance
- Troubleshooting and debugging of existing code when required
- Proficient in code quality, security compliance, and application performance management
- Participation in the agile planning process and estimation of planned tasks
- Good verbal and written communication skills
- Good expertise in unit testing (JUnit)

Requirements: Qualifications & Experience
• 5+ years of experience developing and designing software applications using Java
• Expert understanding of core computer science fundamentals, including data structures, algorithms, and concurrent programming
• Expert in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
• Expert in OOAD and design principles, implementing microservices architecture using JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL, PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
• Experience working in native and hybrid cloud environments
• Experience with Agile development methodology
• Proficiency in agile software development, including technical skill sets such as programming (e.g., Python, Java), multi-tenant cloud technologies, and product management tools (e.g., Jira)
• Strong collaboration and communication skills to work effectively across the product team with product and technology team members and clearly articulate technical ideas
• Ability to translate strategic priorities, expressed as features and user stories, into scalable solutions that are structured, efficient, and user-centric
• Detail-oriented problem solver who can break down complex issues to deliver effectively
• Excellent communicator and team player with a can-do attitude
• Ability to analyze user and business requirements to create technical design requirements and software architecture

Experience must also include:
• Java
• A Java IDE such as Eclipse or IntelliJ
• Java EE application servers such as Apache Tomcat
• Object-oriented design, Git, Maven, and a popular scripting language
• JSON, XML, YAML, and Terraform scripting

Preferred Skills/Experience:
• Champion of Agile Scrum methodologies
• Experience with continuous integration systems such as Jenkins or GitHub CI
• Experience with SAFe methodologies
• Deep knowledge and understanding of creating secure solutions by design
• Multi-threaded backend environments with concurrent users
• Experience with tools or languages such as Ruby, Python, Perl, Node.js, and bash scripting; Spring and Spring Boot; C, C++, Java, and Java EE; Oracle; Docker; Kubernetes

Job Responsibilities: Key Responsibilities & Deliverables
• Feature implementation and production-ready code
• Technical documentation and system diagrams
• Debugging reports and fixes
• Performance optimizations

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, group term life insurance, group personal accident insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!
Posted 4 days ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for high volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
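The extract-transform-load pattern this posting asks for can be sketched in a few lines. This is a hedged, pure-Python stand-in: in the posting's context each stage would be a PySpark or Dataflow step reading from and writing to BigQuery, and all field names below are invented for illustration.

```python
from typing import Iterable

def extract(raw_rows: Iterable[str]) -> list[dict]:
    """Parse raw comma-separated lines into records (the 'E' stage)."""
    out = []
    for line in raw_rows:
        user_id, amount = line.split(",")
        out.append({"user_id": user_id.strip(), "amount": float(amount)})
    return out

def transform(records: list[dict]) -> dict:
    """Aggregate amount per user — a stand-in for a GROUP BY in BigQuery."""
    totals: dict = {}
    for r in records:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

def load(totals: dict) -> list:
    """Materialize sorted result rows; a real job would write a table."""
    return sorted(totals.items())

rows = load(transform(extract(["u1, 10", "u2, 5", "u1, 2.5"])))
print(rows)  # [('u1', 12.5), ('u2', 5.0)]
```

Keeping the three stages as separate pure functions is what makes pipelines like this testable in isolation, whichever engine ultimately runs them.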
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with architects and business analysts, especially for our US clients, you will translate data requirements into effective technical solutions. Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, willingness to work UK shift timings, and openness to giving and receiving feedback will contribute to your success in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology. Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions. Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax Solutions. To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, SpringBoot, TypeScript/JavaScript, as well as hands-on experience with Cloud technologies such as GCP, AWS, or Azure, is essential. 
You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes. Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm Charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role. Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like Java/J2EE and SpringBoot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing. If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

For this role at 66degrees, we are seeking a senior contractor to engage in a 2.5-month remote assignment with the potential to extend. Candidates with the required skills and the ability to work independently as well as within a team environment are encouraged to apply.

As part of the responsibilities, you will be expected to facilitate, guide, and influence the client and teams towards an effective architectural pattern, and serve as an interface between business leadership, technology leadership, and the delivery teams. You will perform Migration Assessments and produce Migration Plans that include Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves. Additionally, you will be responsible for designing a solution architecture on Google Cloud to support critical workloads, including heterogeneous Oracle migrations to Postgres or Spanner. You will need to design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security. You will oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and 3rd-party tools, and setting up and configuring the relevant Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, and technology reviews and recommendations.
Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle Database adjacent products like Golden Gate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience in performing performance testing and applying remediations to address performance issues.
- Experience in designing data models.
- Proficiency in the Python programming language and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience in migrating and/or implementing cloud databases like Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer certification is preferred.

66degrees is committed to protecting your privacy. Your personal information is collected, used, and shared in accordance with the California Consumer Privacy Act (CCPA).
Posted 1 week ago
4.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Role Description
Provides leadership for the overall architecture, design, development, and deployment of a full-stack cloud-native data analytics platform.
- Designing and augmenting solution architecture for data ingestion, data preparation, data transformation, data load, ML and simulation modelling, Java backend and frontend, state machine, API management, and intelligence consumption using data products on the cloud
- Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture
- Developing conceptual, logical, and physical target-state architecture, engineering, and operational specs
- Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform
- Model and design the application data structure, storage, and integration
- Lead the database analysis, design, and build effort
- Work with the application architects and designers to design the integration solution
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth
- Able to perform data engineering tasks using Spark
- Knowledge of developing efficient frameworks for development and testing using Sqoop/NiFi/Kafka/Spark/Streaming/WebHDFS/Python to enable seamless data ingestion processes onto the Hadoop/BigQuery platforms
- Enabling data governance and data discovery
- Exposure to job monitoring frameworks along with validation automation
- Exposure to handling structured, unstructured, and streaming data
Technical Skills
- Experience building data platforms on the cloud (data lake, data warehouse environments, Databricks)
- Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data
- Proven background of designing and implementing architectural solutions that solve strategic and tactical business needs
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing
- Highly competent with database design and data modeling
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for enterprise-level data warehouses, and creating ETLs/ELTs to handle data from various data sources and various formats
- Strong hands-on experience with programming languages like Python and Scala with Spark and Beam
- Solid hands-on and solution architecting experience in the cloud technologies AWS, Azure, and GCP (GCP preferred)
- Hands-on working experience of data processing at scale with event-driven systems and message queues (Kafka/Flink/Spark Streaming)
- Hands-on working experience with GCP services like BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, data lakes, Bigtable, Spark, Apache Beam, and feature engineering/data processing to be used for model development
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
- Experience building data pipelines for structured/unstructured, real-time/batch, and events/synchronous/asynchronous data using MQ, Kafka, and stream processing
- Hands-on working experience in analyzing source system data and data flows, working with structured and unstructured data
- Must be very strong in writing Spark SQL queries
- Strong organizational skills, with the ability to work autonomously as well as lead a team
- Pleasant personality with strong communication and interpersonal skills

Qualifications
- A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead
- Certification in GCP would be a big plus
- Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute
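The event-driven, stream-processing experience asked for above (Kafka/Flink/Spark Streaming) revolves around a small number of core ideas, one of which is windowed aggregation. Below is a minimal pure-Python sketch of a tumbling (fixed, non-overlapping) window count; the window size and events are invented, and a real job would receive events from a message queue rather than a list.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Assign each (timestamp, key) event to a fixed non-overlapping
    window and count events per (window_start, key) pair."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Events at t=1, 4, 7 fall in window [0, 10); t=12 falls in [10, 20).
events = [(1, "click"), (4, "click"), (7, "view"), (12, "click")]
result = tumbling_window_counts(events, window_secs=10)
print(result)  # {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

Engines like Flink or Spark Streaming add the hard parts on top of this kernel: out-of-order event handling via watermarks, state checkpointing, and window triggers.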
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a member of the JM Financial team, you will be part of a culture that values recognition and rewards for the hard work and dedication of its employees. We believe that a motivated workforce is essential for the growth of our organization. Our management team acknowledges and appreciates the efforts of our personnel through promotions, bonuses, awards, and public recognition. By fostering an atmosphere of success, we celebrate achievements such as successful deals, good client ratings, and customer reviews. Nurturing talent is a key focus at JM Financial. We aim to prepare our employees for future leadership roles by creating succession plans and encouraging direct interactions with clients. Knowledge sharing and cross-functional interactions are integral to our business environment, fostering inclusivity and growth opportunities for our team members. Attracting and managing top talent is a priority for JM Financial. We have successfully built a diverse talent pool with expertise, new perspectives, and enthusiasm. Our strong brand presence in the market enables us to leverage the expertise of our business partners to attract the best talent. Trust is fundamental to our organization, binding our programs, people, and clients together. We prioritize transparency, two-way communication, and trust across all levels of the organization. Opportunities for growth and development are abundant at JM Financial. We believe in growing alongside our employees and providing them with opportunities to advance their careers. Our commitment to nurturing talent has led to the appointment of promising employees to leadership positions within the organization. With a focus on employee retention and a supportive environment for skill development, we aim to create a strong future leadership team. Emphasizing teamwork, we value both individual performance and collaborative group efforts. In a fast-paced corporate environment, teamwork is essential for achieving our common vision. 
By fostering open communication channels and facilitating information sharing, we ensure that every member of our team contributes to delivering value to our clients. As a Java Developer at JM Financial, your responsibilities will include designing, modeling, and building services to support new features and products. You will work on an integrated central platform to power various web applications, developing a robust backend framework and implementing features across different products using a combination of technologies. Researching and implementing new technologies to enhance our services will be a key part of your role. To excel in this position, you should have a B.Tech degree in Computer Science or equivalent experience, with at least 3 years of experience building Java-based web applications in Linux/Unix environments. Proficiency in scripting languages such as JavaScript, Ruby, or Python, along with compiled languages like Java or C/C++, is required. Experience with Google Cloud Platform services, knowledge of design methodologies for backend services, and building scalable infrastructure are essential skills for this role. Our technology stack includes JavaScript, Angular, React, NextJS, HTML5/CSS3/Bootstrap, Windows/Linux/OSX, Bash, Kookoo telephony, SMS Gupshup, Sendgrid, Optimizely, Mixpanel, Google Analytics, Firebase, Git, NPM, Browser Dev Console, NoSQL, Google Cloud Datastore, and Google Cloud Platform (App Engine, Pub/Sub, Cloud Functions, Bigtable, Cloud Endpoints). If you are passionate about technology and innovation and thrive in a collaborative environment, we welcome you to join our team at JM Financial.
Posted 1 week ago
5.0 - 8.0 years
15 - 20 Lacs
Pune
Hybrid
We have an opening for Java GCP at Pune only. Please let me know if you are fine with this location, and we will process your profile immediately.
Experience: 5-8 years
Notice Period: 0-30 days
Mandatory skills: Java (Spring Boot), GCP Pub/Sub, Eventos, big data, Bigtable, BigQuery, Composer/Airflow
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a member of the Flipkart team focused on GenZ, you will be at the forefront of the company's strategic growth bet on Video & Live Commerce. The core pillars of this capability are enhancing user experience, empowering creators, and encouraging seller/brands participation. Your primary goal will be to videofy the Flipkart app across various discovery points such as homepage, S&B, and Product Page, while also creating a dedicated discovery destination where users can explore inspirational content akin to TikTok or Instagram reels. You will be instrumental in developing a next-generation Live streaming experience that supports concurrent livestreams for millions of users. Additionally, your role will involve leading the development of cutting-edge systems aimed at enhancing personalization through relevant product discovery for each user. Leveraging GenAI technology, you will drive automated quality control of Images, Videos, Creators, and content to deliver a more personalized shopping experience. Your responsibilities will include driving hyper-personalization of the user experience using Machine Learning and Data Science techniques at various stages of the funnel. By utilizing data-driven insights and a growth mindset, you will continuously strive to enhance user experience at scale, ensuring the delivery of video reels with minimal size compression and latency. From a technical perspective, you will work with a cutting-edge tech stack that includes technologies and frameworks like Kafka, Zookeeper, Apache Pulsar, Spark, Bigtable, HBase, Redis, MongoDB, Elasticsearch, Docker, Kubernetes, and various Video technologies such as OBS, RTMP, Jitsi, and Transcoder. Your role will involve collaborating with diverse stakeholders to deliver scalable and quality technology solutions, while also facilitating platform solutions that extend beyond your team to the wider ecosystem. 
As an Engineering Manager (EM), you will lead a team of engineers across different levels, guiding them towards realizing Flipkart's vision. You will be responsible for setting the direction and long-term vision for the team, partnering with product, business, and other stakeholders to bring this vision to life. Your role will involve providing technical leadership, creating clear career paths for team members, attracting and retaining top talent, driving strategy and vision, and fostering a strong team culture of responsiveness and agility in execution. Overall, as a key member of the Flipkart team, you will play a crucial role in driving innovation, personalization, and growth in the realm of Video & Live Commerce, while contributing to the technical excellence and strategic direction of the organization.
Posted 2 weeks ago
7.0 - 12.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Work Location: Bangalore/Pune/Hyderabad/NCR
Experience: 5-12 years

Required Skills:
- Proven experience as a Data Engineer with expertise in GCP.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with BigQuery, Dataflow, and other GCP data services.

Responsibilities:
- Design, develop, and maintain data pipelines on GCP.
- Implement data storage solutions and optimize data processing workflows.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data scientists and analysts to understand data requirements.
- Monitor and maintain the health of the data infrastructure.
- Troubleshoot and resolve data-related issues.

Thanks & Regards,
Suganya R
Suganya@spstaffing.in
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

In this role, you will be a senior contractor engaged on a 2.5-month remote assignment with the potential to extend. We are looking for candidates with the required skills who can work independently as well as within a team environment.

Your responsibilities will include facilitating, guiding, and influencing the client and teams towards an effective architectural pattern, and acting as an interface between business leadership, technology leadership, and the delivery teams. Additionally, you will perform Migration Assessments and produce Migration Plans that encompass Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves; design solution architecture on Google Cloud to support critical workloads; and handle heterogeneous Oracle migrations to Postgres or Spanner. You will design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security. Your role will also involve overseeing migration activities and providing troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and 3rd-party tools, and setting up and configuring the relevant Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, and technology reviews and recommendations.

Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle Database adjacent products like Golden Gate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience in performing performance testing and applying remediations to address performance issues.
- Experience in designing data models.
- Proficiency in the Python programming language and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience in migrating and/or implementing cloud databases like Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer certification is preferred.

66degrees is committed to protecting your privacy and handles personal information in accordance with the California Consumer Privacy Act (CCPA).
Posted 2 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Mumbai, Mangaluru
Hybrid
- 6 months to 3 years of IT experience
- Knowledge of BigQuery, SQL, or similar tools
- Aware of ETL and data warehouse concepts
- Good oral and written communication skills
- Great team player, able to work efficiently with minimal supervision
- Should have good knowledge of Java or Python to conduct data cleansing

Preferred:
- Good communication and problem-solving skills
- Experience with Spring Boot would be an added advantage
- Apache Beam development experience with Google Cloud Bigtable and Google BigQuery is desirable
- Experience with Google Cloud Platform (GCP)
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow)
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions

Roles and Responsibilities
- Develop high-performance, scalable solutions using GCP that extract, transform, and load big data.
- Design and build production-grade data solutions from ingestion to consumption using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost for large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Closely interact with data engineers to identify the right tools to deliver product features by performing POCs.
- Collaborative team player who interacts with business, BAs, and other data/ML engineers.
- Research new use cases for existing data.
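The posting above asks for Java or Python data cleansing skills. A hedged sketch of the kind of record-level cleansing step that typically precedes loading into BigQuery is shown below; the field names, validation rules, and sample records are all invented for illustration.

```python
from typing import Optional

def cleanse(record: dict) -> Optional[dict]:
    """Drop records missing an id, normalize whitespace and case,
    and coerce the amount field, rejecting values that don't parse."""
    rid = (record.get("id") or "").strip()
    if not rid:
        return None  # unusable without a key
    try:
        amount = float(str(record.get("amount", "0")).replace(",", ""))
    except ValueError:
        return None  # unparseable numeric field
    return {
        "id": rid,
        "email": (record.get("email") or "").strip().lower(),
        "amount": amount,
    }

raw = [
    {"id": " A1 ", "email": "USER@X.COM ", "amount": "1,200.50"},
    {"id": "", "email": "nobody", "amount": "3"},       # dropped: no id
    {"id": "B2", "email": "b@x.com", "amount": "oops"},  # dropped: bad amount
]
clean = [r for r in (cleanse(rec) for rec in raw) if r is not None]
print(clean)  # [{'id': 'A1', 'email': 'user@x.com', 'amount': 1200.5}]
```

In a Beam/Dataflow job, a function like `cleanse` would typically become the body of a `ParDo`, with rejected records routed to a side output for inspection rather than silently dropped.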
Posted 3 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Pune
Hybrid
Software Engineer - Specialist

What you'll do:
- Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures.
- Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices.
- Build strong relationships with internal and external stakeholders, including product, business, and sales partners.
- Communicate clearly, with the ability to both simplify complex problems and dive deeper when needed.
- Lead strong technical teams that deliver complex software solutions that scale.
- Work across teams to integrate our systems with existing internal systems.
- Participate in a tight-knit, globally distributed engineering team.
- Provide deep troubleshooting skills, leading and solving production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS.
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers.
- Ensure compliance with secure software development guidelines and best practices.
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering, and architecture teams.
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks.
- Own implementation architecture decisions associated with product features/stories and refactoring work.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision and presenting complex information in a concise, audience-appropriate format.

What experience you need:
- Bachelor's degree or equivalent experience.
- 5+ years of software engineering experience.
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
- 5+ years of experience with cloud technology: GCP, AWS, or Azure.
- 5+ years of experience designing and developing cloud-native solutions.
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
- 5+ years of experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Strong communication and presentation skills.
- Strong leadership qualities.
- Demonstrated problem-solving skills and the ability to resolve conflicts.
- Experience creating and maintaining product and software roadmaps.
- Working in a highly regulated environment.
- Experience on GCP in big data and distributed systems: Dataflow, Apache Beam, Pub/Sub, Bigtable, BigQuery, GCS.
- Experience with backend technologies such as Java/J2EE, Spring Boot, Golang, gRPC, SOA, and microservices.
- Source code control management systems (e.g., SVN/Git, GitHub, GitLab), build tools like Maven and Gradle, and CI/CD like Jenkins or GitLab.
- Agile environments (e.g., Scrum, XP).
- Relational databases (e.g., SQL Server, MySQL).
- Atlassian tooling (e.g., Jira, Confluence) and GitHub.
- Developing with a modern JDK (v1.7+).
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
Posted 3 weeks ago
11.0 - 17.0 years
45 - 50 Lacs
Pune
Work from Office
Job Title: Fintech Product Engineering Lead Corporate Title: VP Location: Pune, India

Role Description
The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the bank

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your Key Responsibilities:
- Navigate a strong sense of urgency while maintaining focus and clarity.
- Solve complex design challenges independently, without needing oversight.
- Proven track record of quickly delivering high-quality code and features.
- Inspire and energise teams through urgency, ownership, and technical excellence.
- Willing to do whatever it takes to ensure product success, from strategy to hands-on execution.
- Deep experience in architecting scalable systems (HLD & LLD) in fast-paced environments.
- Comfortable leading through ambiguity, change, and high-growth pressure.
- Known for balancing speed with engineering quality and operational readiness.
- Strong communicator: can align teams, resolve conflicts, and drive decisions fast.
- A true builder mindset: acts with ownership, speed, and high accountability.

Your skills and experience:
- Hands-on experience building responsive UIs with React and JavaScript.
- Hands-on knowledge of Go (Golang)/Java and the Gin/Spring Boot frameworks for backend development.
- Proficient in HTML, CSS, and styling tools like Tailwind.
- Proficient in RESTful, GraphQL, and gRPC APIs for building scalable, high-performance services.
- Experience with GCP/AWS for building scalable, resilient microservice-based architectures.
- Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, Firestore, Bigtable).
- Experience with logging, monitoring, and alerting (e.g., Grafana, Prometheus, ELK).
- Familiarity with CI/CD pipelines, automated testing, and deployment strategies, with detailed knowledge of Terraform.
- Knowledge of best practices for building secure applications (e.g., mTLS, encryption, OAuth, JWT, and data compliance).
- Knowledge of disaster recovery, zero-downtime deploys, and backup strategies.

How we'll support you
Posted 3 weeks ago
0.0 - 3.0 years
6 - 8 Lacs
Noida
Work from Office
3+ years of engineering experience in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
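As a quick refresher on the SQL features this posting names (CTEs and window functions), here is a small sketch using SQLite via Python's standard library. The table and column names are invented for the illustration; the equivalent query in BigQuery's Standard SQL is essentially identical.

```python
import sqlite3

# In-memory database with a toy orders table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', 120.0), ('alice', 80.0), ('bob', 200.0), ('bob', 50.0), ('bob', 75.0);
""")

# A CTE computes per-customer totals; a window function ranks customers by spend.
query = """
WITH totals AS (
  SELECT customer, SUM(amount) AS total
  FROM orders
  GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('bob', 325.0, 1), ('alice', 200.0, 2)]
```

Note that window functions require SQLite 3.25 or newer, which ships with any recent Python.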
Posted 3 weeks ago
4.0 - 8.0 years
22 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
3+ years of engineering experience in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
Posted 3 weeks ago
7.0 - 12.0 years
25 - 27 Lacs
Hyderabad
Work from Office
3+ years of engineering experience in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design, develop, and maintain backend services and APIs using Python (Flask/Django/FastAPI).
- Develop scalable and secure microservices for data processing, analytics, and APIs.
- Manage and optimize data storage with SQL (PostgreSQL/MySQL) and NoSQL databases (MongoDB/Firestore/Bigtable).
- Design and implement CI/CD pipelines and automate cloud deployments on GCP (App Engine, Cloud Run, Cloud Functions, GKE).
- Collaborate with front-end developers, product owners, and other stakeholders to integrate backend services with business logic and UI.
- Optimize application performance and troubleshoot issues across backend systems.
- Implement best practices in code quality, testing (unit/integration), security, and scalability.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 3+ years of relevant IT experience.
- Strong hands-on programming experience in Python.
- Experience with one or more Python frameworks: Flask, Django, or FastAPI.
- Deep understanding of RESTful API design and development.
- Proficient with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Firestore, BigQuery).
- Solid understanding of and experience with GCP services.
- Familiarity with Git and CI/CD tools (e.g., Cloud Build, Jenkins, GitHub Actions).
- Strong debugging, problem-solving, and performance-tuning skills.
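This posting centers on Python REST backends (Flask/Django/FastAPI). Whichever framework is used, the heart of an endpoint is usually a plain validation/serialization layer that the route handler delegates to. A framework-agnostic sketch, with field names invented for the example:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CreateUserRequest:
    name: str
    age: int

def parse_create_user(body: str) -> tuple[int, dict]:
    """Validate a JSON request body; return (http_status, response_dict).

    This is the logic a Flask/FastAPI route handler would delegate to."""
    try:
        data = json.loads(body)
    except json.JSONDecodeError:
        return 400, {"error": "invalid JSON"}
    if not isinstance(data.get("name"), str) or not data["name"].strip():
        return 400, {"error": "name is required"}
    if not isinstance(data.get("age"), int) or data["age"] < 0:
        return 400, {"error": "age must be a non-negative integer"}
    req = CreateUserRequest(name=data["name"].strip(), age=data["age"])
    return 201, {"created": asdict(req)}

print(parse_create_user('{"name": "Asha", "age": 30}'))
```

Keeping validation out of the route function makes it unit-testable without spinning up the framework, which is what "testing (unit/integration)" in the responsibilities list amounts to in practice.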
Posted 3 weeks ago
5.0 - 10.0 years
18 - 25 Lacs
Sholinganallur
Hybrid
Skills Required: BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner
Skills Preferred: ETL

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner.
- Experience developing with microservice architecture and container orchestration frameworks.
- Designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Thanks & Regards,
Varalakshmi V
9019163564
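The pipeline design work this role describes often boils down to small, composable transforms such as deduplication and per-key aggregation. A toy sketch in plain Python (the event fields are invented; in a real GCP pipeline these records would arrive from Pub/Sub or Cloud Storage and the transform would run inside Dataflow):

```python
from collections import defaultdict

# Toy click events; "e1" is delivered twice, as happens with
# at-least-once delivery from a message bus.
events = [
    {"id": "e1", "user": "u1", "clicks": 3},
    {"id": "e2", "user": "u2", "clicks": 1},
    {"id": "e1", "user": "u1", "clicks": 3},  # duplicate delivery
    {"id": "e3", "user": "u1", "clicks": 2},
]

def dedupe_and_aggregate(events):
    """Drop duplicate event ids, then sum clicks per user."""
    seen, totals = set(), defaultdict(int)
    for ev in events:
        if ev["id"] in seen:
            continue
        seen.add(ev["id"])
        totals[ev["user"]] += ev["clicks"]
    return dict(totals)

print(dedupe_and_aggregate(events))  # {'u1': 5, 'u2': 1}
```

At production scale the `seen` set would be replaced by keyed state or a windowed distinct transform, since a single in-memory set does not survive a distributed, unbounded stream.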
Posted 4 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
About the Team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant that it is today. We value speed over perfection and see failures as opportunities to become better. We've taken steps to inculcate a strong Founder's Mindset across our engineering teams, making us grow and move fast. We place special emphasis on the continuous growth of each team member, and we do this with regular 1-1s and open communication. As a Database Engineer II, you will be part of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favorite books and games or even gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the Role
As a Database Engineer II, you'll proactively establish and implement the best NoSQL database engineering practices. You'll have opportunities to work on different NoSQL technologies at large scale. You'll also work closely with other engineering teams and establish seamless collaborations within the organization. Proficiency in emerging technologies and the ability to work successfully with a team are key to success in this role.

What you will do:
- Manage, maintain, and monitor a multitude of relational/NoSQL database clusters, ensuring obligations to SLAs.
- Manage both in-house and SaaS solutions in the public cloud (or 3rd party).
- Diagnose, mitigate, and communicate database-related issues to relevant stakeholders.
- Design and implement best practices for planning, provisioning, tuning, upgrading, and decommissioning of database clusters.
- Understand the cost optimization aspects of such tools/software and implement cost-control mechanisms along with continuous improvement.
- Advise and support product, engineering, and operations teams.
- Maintain general backup/recovery/DR of data solutions.
- Work with the engineering and operations teams to automate new approaches for scalability, reliability, and performance.
- Perform R&D on new features and innovative solutions.
- Participate in on-call rotations.

What you will need:
- 5+ years of experience in provisioning and managing relational/NoSQL databases.
- Proficiency in two or more of: MySQL, PostgreSQL, Bigtable, Elasticsearch, MongoDB, Redis, ScyllaDB.
- Proficiency in Python.
- Experience with deployment orchestration, automation, and security configuration management (Jenkins, Terraform, Ansible).
- Hands-on experience with Amazon Web Services (AWS)/Google Cloud Platform (GCP).
- Comfortable working in Linux/Unix environments.
- Knowledge of the TCP/IP stack, load balancers, and networking.
- Proven ability to drive projects to completion.
- A degree in computer science, software engineering, information technology, or related fields will be an advantage.
Posted 1 month ago
4.0 - 9.0 years
10 - 15 Lacs
Pune
Work from Office
MS Azure Infra (must); PaaS will be a plus; ensuring solutions meet regulatory standards and manage risk effectively. Hands-on experience using Terraform to design and deploy solutions (at least 5+ years), adhering to best practices to minimize risk and ensure compliance with regulatory requirements.

Primary Skill:
- AWS Infra along with PaaS will be an added advantage.
- Certification in Terraform is an added advantage.
- Certification in Azure and AWS is an added advantage.
- Can handle large audiences to present HLD, LLD, and ERC.
- Able to drive solutions/projects independently and lead projects with a focus on risk management and regulatory compliance.

Secondary Skills: Amazon Elastic File System (EFS), Amazon Redshift, Amazon S3, Apache Spark, Ataccama DQ Analyzer, AWS Apache Airflow, AWS Athena, Azure Data Factory, Azure Data Lake Storage Gen2 (ADLS), Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse Analytics, BigID, C++, Cloud Storage, Collibra Data Governance (DG), Collibra Data Quality (DQ), Data Lake Storage, Data Vault Modeling, Databricks, Dataproc, DDI, Dimensional Data Modeling, EDC AXON, Electronic Medical Record (EMR), Extract, Transform & Load (ETL), Financial Services Logical Data Model (FSLDM), Google Cloud Platform (GCP) BigQuery, Google Cloud Platform (GCP) Bigtable, Google Cloud Platform (GCP) Dataproc, HQL, IBM InfoSphere Information Analyzer, IBM Master Data Management (MDM), Informatica Data Explorer, Informatica Data Quality (IDQ), Informatica Intelligent Data Management Cloud (IDMC), Informatica Intelligent MDM SaaS, Inmon methodology, Java, Kimball Methodology, Metadata Encoding & Transmission Standards (METS), Metasploit, Microsoft Excel, Microsoft Power BI, NewSQL, NoSQL, OpenRefine, OpenVAS, Performance Tuning, Python, R, RDD Optimization, SAS, SQL, Tableau, Tenable Nessus, TIBCO Clarity
Posted 1 month ago
10.0 - 15.0 years
30 - 40 Lacs
Noida, Pune, Bengaluru
Hybrid
- Strong experience in big data: data modelling, design, architecting, and solutioning.
- Understands programming languages such as SQL, Python, and R/Scala.
- Good Python skills; experience with data visualisation tools such as Google Data Studio or Power BI.
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, agile development, DevOps, data engineering, and ETL data processing.
- Strong migration experience of production Hadoop clusters to Google Cloud.

Good to have:
- Expertise in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
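Among the skills listed above is A/B testing with statistics. The pooled two-proportion z-test behind many A/B readouts can be computed with the standard library alone; the conversion counts below are invented for the example:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Pooled two-proportion z-statistic for comparing conversion rates.

    z = (p_a - p_b) / sqrt(p_pool * (1 - p_pool) * (1/n_a + 1/n_b))
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Variant A: 120/1000 converted; variant B: 150/1000.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 3))  # about -1.963, just inside the 5% two-sided threshold
```

With |z| close to 1.96, this hypothetical experiment sits right at the conventional 5% significance boundary, which is exactly the kind of borderline result where sample-size planning matters.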
Posted 1 month ago
5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering, Bachelor of Technology, Master of Engineering, Master of Technology, Integrated course BCA+MCA, Master of Science (Technology), Bachelor of Science (Tech), Bachelor of Computer Applications
Service Line: Application Development and Maintenance

Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs into solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Technical and Professional:
Primary skills: Technology-Big Data-Big Table, Technology-Cloud Integration-Azure Data Factory (ADF), Technology-Data On Cloud - Platform-AWS
Preferred skills: Technology-Big Data-Big Table-GCP, Technology-Data On Cloud - Platform-AWS, Technology-Cloud Integration-Azure Data Factory (ADF)
Posted 1 month ago
10.0 - 15.0 years
12 - 16 Lacs
Pune
Work from Office
To be successful in this role, you should meet the following requirements (must have):
- Payments and banking experience is a must.
- Experience in implementing and monitoring data governance using standard methodology throughout the data life cycle, within a large organisation.
- Up-to-date knowledge of data governance theory, standard methodology, and the practical considerations.
- Knowledge of data governance industry standards and tools.
- 10+ years of overall experience in data governance, encompassing data quality management, master data management, data privacy & compliance, data cataloguing and metadata management, data security, maturity, and lineage.
- Prior experience in implementing an end-to-end data governance framework.
- Experience in automating data cataloguing, ensuring accurate, consistent metadata, making data easily discoverable and usable.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams, and local data regulations.
- Strong communication skills, coupled with the ability to present complex information and data.
- A first-class degree in Engineering or a relevant field, with two or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Database types: relational, NoSQL, DocumentDB. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j.
- Experience in conceptual/logical/physical data modeling.
- Experience in agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player, with a strong commitment to quality and efficiency.
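The role above emphasises automating data cataloguing. At its simplest, that means programmatically harvesting schema metadata from a database into a searchable record. A toy sketch using SQLite from the standard library (a real implementation would read from the production databases and write into the organisation's catalogue tool; the `payments` schema here is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (txn_id TEXT PRIMARY KEY, amount REAL, booked_at TEXT)"
)

def catalogue_table(conn, table: str) -> list[dict]:
    """Harvest column-level metadata the way a cataloguing job would.

    PRAGMA table_info returns (cid, name, type, notnull, default, pk) per column."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return [
        {"table": table, "column": name, "type": ctype, "primary_key": bool(pk)}
        for (_cid, name, ctype, _notnull, _default, pk) in cols
    ]

entries = catalogue_table(conn, "payments")
print(entries[0])  # {'table': 'payments', 'column': 'txn_id', 'type': 'TEXT', 'primary_key': True}
```

Running such a harvester on a schedule, and diffing its output against the catalogue, is one straightforward way to keep metadata "accurate and consistent" as the posting requires.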
Posted 1 month ago