3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology for leading organizations to build a more sustainable and inclusive world.

Your Role:
- A very good understanding of the current work and the tools and technologies in use.
- Comprehensive knowledge of and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience with Fact and Dimension tables and SCD.
- A minimum of 3 years of experience in GCP Data Engineering.
- Proficiency in Java, Python, or Spark on GCP, with programming experience in Python, Java, or PySpark, plus SQL.
- Hands-on experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Ability to handle big data efficiently.

Your Profile:
- Strong data engineering experience using the Java or Python programming languages, or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Familiarity with other GCP services and databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career with a range of career paths and internal opportunities within the Capgemini group.
- Access to comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
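The Fact/Dimension and SCD experience listed under Your Role can be illustrated with a small sketch. This is a minimal, illustrative Type 2 slowly-changing-dimension update in plain Python; the record fields (`customer_id`, `city`) and the helper name are hypothetical, not from the posting, and in practice this logic would typically run as a BigQuery MERGE orchestrated by an Airflow/Composer DAG rather than in application code.

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Type 2 SCD: when a tracked attribute changes, close the current
    row (set valid_to / is_current) and append a new versioned row.
    Rows carry a business key `customer_id`, one tracked attribute
    `city`, and validity columns valid_from / valid_to / is_current."""
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        key = row["customer_id"]
        cur = current.get(key)
        if cur is None or cur["city"] != row["city"]:
            if cur is not None:               # expire the old version
                cur["valid_to"] = today
                cur["is_current"] = False
            dimension.append({                # insert the new version
                "customer_id": key, "city": row["city"],
                "valid_from": today, "valid_to": None, "is_current": True,
            })
    return dimension

dim = [{"customer_id": 1, "city": "Bengaluru",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "city": "Chennai"}], date(2024, 6, 1))
# dim now holds two versions: the expired Bengaluru row and the current Chennai row.
```

The same pattern keeps full history in the dimension table, which is what distinguishes SCD Type 2 from a simple overwrite (Type 1).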
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its over 55-year heritage to unlock the value of technology for clients across the entire breadth of their business needs. The company delivers end-to-end services and solutions, combining strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, along with deep industry expertise and a strong partner ecosystem.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
Role Overview: As a Software Engineer 3 at Costco Wholesale, you will be an integral part of the Costco GDX Order Management Platform team. Your main responsibilities will revolve around designing, developing, and maintaining critical services that power Costco's multi-billion-dollar eCommerce business. You will work in a fast-paced environment, focusing on back-end microservices development to enhance the customer experience across various digital touchpoints.

Key Responsibilities:
- Support Senior Engineers in designing the Order Management Platform's overall architecture with a focus on availability, reusability, interoperability, and security.
- Perform development, optimization, and automation activities to support the implementation of the Order Management platform.
- Utilize engineering best practices to deliver high-quality, scalable solutions.
- Implement test-driven development practices to detect defects early in the development process.
- Collaborate with Senior Engineers to maintain coding standards, architectural patterns, and development processes.
- Conduct peer code reviews for changes made by other team members.
- Work with the product team on feature/story grooming.
- Participate in scrum ceremonies and collaborate with the team to define specifications and documentation across all phases of the product development cycle.

Qualifications Required:
- 8-12 years of experience.
- Proficiency in C#, TypeScript, REST, JSON, XML, YAML, Swagger, microservices, and REST APIs.
- Hands-on experience designing and developing containerized services based on the .NET Core framework.
- Experience writing unit tests using NUnit or a similar framework.
- Familiarity with CI/CD tools such as GitHub and Jenkins.
- Strong expertise in API development, emphasizing security and performance.
- Experience with microservice-based debugging and performance testing.
- Development experience within an agile methodology.
- Knowledge of database application development on relational and NoSQL platforms.
- Excellent verbal and written communication skills to engage with technical and business audiences.
- Ability to work under pressure, handle crisis situations with urgency, and demonstrate responsibility and self-motivation.
- Detail-oriented with strong problem-solving skills and the ability to analyze potential future issues.
- Willingness to support off-hours work as required, including weekends, holidays, and on-call responsibilities on a rotational basis.
- Bachelor's degree in Computer Science, Engineering, or a related field.

Additional Details of the Company: Costco Wholesale is a multi-billion-dollar global retailer operating in fourteen countries. Costco Wholesale India focuses on developing innovative solutions to enhance member experiences and drive IT as a core competitive advantage for Costco Wholesale. The collaborative work environment aims to support employees in delivering innovation and improving processes within the organization.
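The test-driven development practice named in the responsibilities can be sketched briefly. This is an illustrative red-green example in Python with a hypothetical order-total function (the team's actual stack is C#/.NET with NUnit, so names and framework here are stand-ins): in TDD the tests below would be written first, fail, and then drive the implementation.

```python
import unittest

def order_total(items, tax_rate=0.0):
    """Sum line items (price * quantity) and apply a flat tax rate."""
    subtotal = sum(i["price"] * i["qty"] for i in items)
    return round(subtotal * (1 + tax_rate), 2)

class OrderTotalTest(unittest.TestCase):
    # Written before the implementation; they fail until
    # order_total() above makes them pass.
    def test_empty_order(self):
        self.assertEqual(order_total([]), 0.0)

    def test_tax_applied(self):
        items = [{"price": 10.0, "qty": 2}, {"price": 5.0, "qty": 1}]
        # subtotal 25.0, plus 10% tax -> 27.5
        self.assertEqual(order_total(items, tax_rate=0.1), 27.5)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(OrderTotalTest))
```

The NUnit equivalent follows the same shape: small, behavior-named test methods that pin the contract down before the production code exists.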
Posted 5 days ago
8.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Platform Development and Machine Learning expert at Adobe, you will play a crucial role in changing the world through digital experiences by building scalable AI platforms and designing ML pipelines. Your responsibilities will include:
- Building scalable, customer-facing AI platforms and evangelizing the platform with customers and internal stakeholders.
- Ensuring platform scalability, reliability, and performance to meet business needs.
- Designing ML pipelines for experiment management, model management, feature management, and model retraining.
- Implementing A/B testing of models and designing APIs for model inferencing at scale.
- Demonstrating proven expertise with MLflow, SageMaker, Vertex AI, and Azure AI.
- Serving as a subject matter expert in LLM serving paradigms, with deep knowledge of GPU architectures.
- Distributed training and serving of large language models, including model- and data-parallel training using frameworks like DeepSpeed and serving frameworks like vLLM.
- Proven expertise in model fine-tuning and optimization techniques to achieve better latencies and accuracies in model results.
- Reducing training and resource requirements for fine-tuning LLM and LVM models.
- Extensive knowledge of different LLM models, providing insights on the applicability of each model based on use cases.
- Delivering end-to-end solutions, from engineering to production, for specific customer use cases.
- Proficiency in DevOps and LLMOps practices, including Kubernetes, Docker, and container orchestration.
- Deep understanding of LLM orchestration frameworks like Flowise, Langflow, and LangGraph.
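The A/B testing of models mentioned above is commonly implemented as a deterministic traffic split at the inference API: hash a stable request attribute (e.g. a user id) into a bucket so each user consistently hits the same model variant. The sketch below is a generic illustration of that pattern, not Adobe's implementation; the split fraction and variant names are hypothetical.

```python
import hashlib

def assign_variant(user_id, split=0.1):
    """Deterministically route `split` fraction of users to the
    challenger model ("model_b"), the rest to the control ("model_a")."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # roughly uniform in [0, 1]
    return "model_b" if bucket < split else "model_a"

# The assignment is sticky: the same user always gets the same variant,
# which keeps per-user metrics clean during the experiment.
assert assign_variant("user-42") == assign_variant("user-42")

# With a 10% split, close to 10% of a large user population
# lands on the challenger.
share = sum(assign_variant(f"user-{i}") == "model_b" for i in range(10_000)) / 10_000
```

Because the route is a pure function of the user id, the split needs no shared state across inference replicas, which matters for serving at scale.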
Your skills matrix should include:
- LLMs: Hugging Face OSS LLMs, GPT, Gemini, Claude, Mixtral, Llama.
- LLM Ops: MLflow, LangChain, LangGraph, LangFlow, Flowise, LlamaIndex, SageMaker, AWS Bedrock, Vertex AI, Azure AI.
- Databases/data warehouses: DynamoDB, Cosmos, MongoDB, RDS, MySQL, PostgreSQL, Aurora, Spanner, Google BigQuery.
- Cloud: knowledge of AWS/Azure/GCP.
- DevOps: Kubernetes, Docker, FluentD, Kibana, Grafana, Prometheus.
- Cloud certifications (bonus): AWS Professional Solutions Architect, AWS Machine Learning Specialty, Azure Solutions Architect Expert.
- Proficiency in Python, SQL, and JavaScript.

Adobe is committed to creating exceptional employee experiences and values diversity. If you require accommodations to navigate the website or complete the application process, please contact accommodations@adobe.com or call (408) 536-3015.
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Haryana
On-site
About AutoZone: AutoZone is the nation's leading retailer and a leading distributor of automotive replacement parts and accessories, with more than 6,000 stores in the US, Puerto Rico, Mexico, and Brazil. Each store carries an extensive line for cars, sport utility vehicles, vans, and light trucks, including new and remanufactured hard parts, maintenance items, and accessories. We also sell automotive diagnostic and repair software through ALLDATA, diagnostic and repair information through ALLDATAdiy.com, automotive accessories through AutoAnything.com, and auto and light truck parts and accessories through AutoZone.com. Since opening its first store in Forrest City, Ark., on July 4, 1979, the company has joined the New York Stock Exchange (NYSE: AZO) and earned a spot in the Fortune 500. AutoZone has been committed to providing the best parts, prices, and customer service in the automotive aftermarket industry. We have a rich culture and history of going the Extra Mile for our customers and our community. At AutoZone, you're not just doing a job; you're playing a crucial role in creating a better experience for our customers, while creating opportunities to DRIVE YOUR CAREER almost anywhere! We are looking for talented people who are customer-focused, enjoy helping others, and have the DRIVE to excel in a fast-paced environment!

Position Summary: The Systems Engineer will design data model solutions and ensure alignment between business and IT strategies, operating models, guiding principles, and software development, with a focus on the information layer. The Systems Engineer works across business lines and IT domains to ensure that information is viewed as a corporate asset. This includes its proper data definition, creation, usage, archival, and governance. The Systems Engineer works with other engineers and Data Architects to design overall solutions in accordance with industry best practices, principles, and standards.
The Systems Engineer strives to create and improve the quality of systems, provide more flexible solutions, and reduce time-to-market.

Key Responsibilities:
- Enhance and maintain the AutoZone information strategy.
- Ensure alignment of programs and projects with the strategic AZ Information Roadmap and related strategies.
- Perform gap analysis between current data structures and target data structures.
- Enhance and maintain the Enterprise Information Model.
- Work with service architects and application architects to assist with the creation of proper data access and utilization methods.
- Gather complex business requirements and translate product and project needs into data models supporting long-term solutions.
- Serve as a technical data strategy expert and lead the creation of technical requirements and design deliverables.
- Define and communicate data standards, industry best practices, technologies, and architectures.
- Check conformance to standards and resolve any conflicts by explaining and justifying architectural decisions.
- Recommend and evaluate new tools and methodologies as needed.
- Manage, communicate, and improve the data governance framework.

Requirements:
- A systems thinker, able to move fluidly between high-level abstract thinking and detail-oriented implementation, open-minded to new ideas, approaches, and technologies.
- A data- and fact-driven decision-maker, able to make quick decisions under uncertainty when necessary and to quickly learn new technologies, tools, and organizational structures/strategies.
- Understanding of current industry-standard best practices regarding integration, architecture, tools, and processes.
- A self-starter who is naturally inquisitive and can work from only a few pieces of the puzzle, across many technologies, new and legacy.
- Excellent written and verbal communication, presentation, and analytical skills, including the ability to effectively communicate complex technical concepts and designs to a broad range of people.

Education and/or Experience:
- Bachelor's degree in MIS, Computer Science, or a similar field, or equivalent experience.
- Minimum 3+ years of experience with database systems such as Oracle, Postgres, UDB/DB2, BigQuery, Spanner, JSON, and Couchbase.
- Minimum 2 years of experience with data requirements gathering; acquisition of data from different business systems; ingestion of data in GCP using managed services, namely BigQuery, Dataflow, Composer, Pub/Sub, and other ingestion technologies; curation of the data using DBT or similar technologies; and creation of data marts/wide tables for analysis and reporting consumption.
- Assembling large, complex sets of data that meet non-functional and functional business requirements.
- Identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using GCP and SQL technologies.
- Building analytical tools that utilize the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
- Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues.
- Relational and NoSQL database design capability across OLTP and OLAP.
- Excellent analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to facilitate modeling sessions and communicate appropriately with IT and business customers.
- Experience with Agile software development methodologies.
- Experience with large replicated databases across distributed and cloud data centers.

Our Values: An AutoZoner Always.
- PUTS CUSTOMERS FIRST
- CARES ABOUT PEOPLE
- STRIVES FOR EXCEPTIONAL PERFORMANCE
- ENERGIZES OTHERS
- EMBRACES DIVERSITY
- HELPS TEAMS SUCCEED
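The data-mart/wide-table work described in the requirements above can be sketched in miniature: denormalizing a fact table against its dimensions into one flat record per fact, the shape analysts and reports query directly. This is a plain-Python stand-in with hypothetical column names; in practice it would be a BigQuery SQL model, typically managed with DBT.

```python
def build_wide_table(facts, dims):
    """Join each fact row to its dimension rows, producing one flat
    (wide) record per fact for reporting consumption."""
    wide = []
    for f in facts:
        row = dict(f)
        for key, table in dims.items():      # e.g. "store_id" -> store dim
            row.update(table.get(f[key], {}))
        wide.append(row)
    return wide

facts = [{"sale_id": 1, "store_id": "S1", "part_id": "P9", "amount": 42.5}]
dims = {
    "store_id": {"S1": {"store_city": "Gurugram", "store_region": "North"}},
    "part_id": {"P9": {"part_name": "Brake Pad", "part_category": "Brakes"}},
}
wide = build_wide_table(facts, dims)
# wide[0] carries the fact columns plus all joined dimension attributes.
```

Trading storage for query simplicity this way is the standard OLAP pattern: the joins are paid once at build time instead of on every analyst query.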
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
Karnataka
On-site
The ideal candidate for this role should hold a Bachelor's degree in Computer Science or a related technical field, or possess equivalent practical experience, along with at least 8 years of experience in software development using one or more programming languages such as Python, C, C++, or Java. In addition, the candidate should have a minimum of 3 years of experience in a technical leadership role overseeing projects, with at least 2 years in people management or team leadership positions.

Preferred qualifications for this role include experience building and deploying large-scale systems; familiarity with Distributed Processing and Infrastructure Design or Ads Infrastructure; and experience with C++, Spanner, or CNS. A strong understanding of software fundamentals, distributed systems, and large-scale data processing is also highly desirable.

As a Software Engineering Manager at Google, you will provide technical leadership on major projects while managing a team of Engineers. In this position, you will not only optimize your own code but also ensure that the Engineers under your supervision can optimize theirs. Your responsibilities will include managing project goals, contributing to product strategy, and assisting in the development of your team. You will lead and grow a team of 6-8 engineers in India focused on designing, developing, and deploying a new platform. As a leader, you will define the technical roadmap for the platform's development, collaborating with technical leads and product managers to drive execution from concept to deployment. Furthermore, you will mentor and lead a high-performing team, creating a collaborative and supportive environment where team members can thrive.
Additionally, you will contribute technically to the development process, particularly in areas such as mapping user journeys to technical workflows, API integration with sister teams, backend integration, and social media platform data storage and correlation. Collaboration across teams and engagement with global experts will be essential, contributing to cutting-edge data-driven decision-making processes.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Moving our world forward by delivering what matters!

UPS is a company with a proud past and an even brighter future. Our values define us, our culture differentiates us, and our strategy drives us. At UPS, we are customer first, people-led, and innovation-driven. UPS's India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and engineer technology solutions that improve our competitive advantage in the field of Logistics. As a visible and valued Technology professional with UPS, you will drive us towards an exciting tomorrow. If you are solutions-oriented, UPS Technology is the place for you. You will deliver ground-breaking solutions to some of the biggest logistics challenges globally and make a significant difference for UPS and our customers.

This position provides input and support for full systems lifecycle management activities such as analyses, technical requirements, design, coding, testing, and implementation of systems and applications software. You will collaborate with teams to ensure effective communication and support the achievement of objectives. Additionally, you will provide knowledge, development, maintenance, and support for applications. Responsibilities include generating application documentation, contributing to systems analysis and design, designing and developing moderately complex applications, contributing to integration builds, contributing to maintenance and support, and monitoring emerging technologies and products.
Qualifications include a Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field. Knowledge of Java and Spring Boot, strong skills in Google Cloud, and exposure to Spanner are required. This is a permanent position at UPS, which is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data Engineer at our company, you will collaborate with a world-class team to drive the telecom business to its full potential. We are focused on building data products for telecom wireless and wireline business segments, including consumer analytics and telecom network performance. Our projects involve cutting-edge technologies like digital twin to develop analytical platforms and support AI and ML implementations.

Your responsibilities will include working closely with business product owners, data scientists, and system architects to develop strategic data solutions from various sources such as batch, file, and data streams. You will be utilizing your expertise in ETL/ELT processes to integrate structured and unstructured data into our data warehouse and data lake for real-time streaming and batch processing, enabling data-driven insights and analytics for business teams within Verizon.

Key Responsibilities:
- Understanding business requirements and translating them into technical designs.
- Data ingestion, preparation, and transformation.
- Developing data streaming applications.
- Resolving production failures and identifying solutions.
- Working on ETL/ELT development.
- Contributing to DevOps pipelines and understanding the DevOps process.

Qualifications:
- Bachelor's degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Proficiency in Data Warehouse concepts and the Data Management lifecycle.
- Experience with Big Data technologies such as GCP, Hadoop, Spark, Composer, Dataflow, and BigQuery.
- Strong skills in complex SQL.
- Hands-on experience with streaming ETL pipelines.
- Expertise in Java programming.
- Familiarity with Memorystore, Redis, and Spanner.
- Ability to troubleshoot data issues.
- Knowledge of data pipeline and workflow management tools.
- Understanding of Information Systems and their applications in data management.

Preferred Qualifications:
- Three or more years of relevant experience.
- Certification in ETL/ELT development or as a GCP Data Engineer.
- Strong attention to detail and accuracy.
- Excellent problem-solving, analytical, and research skills.
- Effective verbal and written communication abilities.
- Experience in presenting to and influencing stakeholders.
- Previous experience leading a small technical team for project delivery.

If you are passionate about new technologies and enjoy applying your technical expertise to solve business challenges, we encourage you to apply for this exciting opportunity to work on innovative data projects in the telecom industry.
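The streaming ETL work called for above can be illustrated with a tiny tumbling-window aggregation, the core operation behind many real-time network-performance metrics. This is a pure-Python sketch with hypothetical event fields (`ts`, `cell_id`); a production pipeline would express the same windowing in Dataflow/Beam or Spark, with BigQuery as the sink.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (window_start, cell_id) using fixed,
    non-overlapping windows keyed by the event timestamp."""
    counts = defaultdict(int)
    for e in events:
        # Floor the timestamp to the start of its window.
        window_start = e["ts"] - (e["ts"] % window_secs)
        counts[(window_start, e["cell_id"])] += 1
    return dict(counts)

events = [
    {"ts": 5,  "cell_id": "A"},
    {"ts": 59, "cell_id": "A"},
    {"ts": 61, "cell_id": "A"},   # falls in the next 60s window
    {"ts": 62, "cell_id": "B"},
]
counts = tumbling_window_counts(events)
# -> {(0, 'A'): 2, (60, 'A'): 1, (60, 'B'): 1}
```

Real streaming engines add what this sketch omits, chiefly event-time watermarks and handling of late-arriving events, but the window-key-then-aggregate shape is the same.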
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role:
- Very good understanding of the current work and the tools and technologies being used.
- Comprehensive knowledge of and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with Fact and Dimension tables and SCD.
- Minimum 3 years of experience in GCP Data Engineering.
- Java/Python/Spark on GCP, with programming experience in at least one of Python, Java, or PySpark, plus SQL.
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience.
- Should have worked on handling big data.

Your Profile:
- Strong data engineering experience using the Java or Python programming languages, or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Experience with any other GCP services or databases like Datastore, Bigtable, Spanner, Cloud Run, Cloud Functions, etc.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
- You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
Posted 2 weeks ago
7.0 - 12.0 years
15 - 27 Lacs
Chennai
Hybrid
Senior Big Data Developer (GCP: BigQuery, Dataflow, Dataproc, Spanner). Very good communication skills. Self-starter and quick learner. Willing to work from the office in hybrid mode.
Posted 3 weeks ago
7.0 - 12.0 years
18 - 27 Lacs
Chennai
Hybrid
Role & responsibilities:
- 7+ years of strong experience in Big Data and GCP (BigQuery, Dataflow, Dataproc, Spanner).
- Good knowledge of and experience in SQL.
- Very good communication skills.
- Self-starter and quick learner.
- Willing to work from the office in hybrid mode.
Posted 3 weeks ago
7.0 - 9.0 years
32 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description: We are looking for an experienced GCP Data Engineer to join our team. The ideal candidate will have strong expertise in designing and developing scalable data pipelines and real-time processing solutions on Google Cloud Platform.

Key Skills Required:
- Hands-on experience with GCP Dataflow and real-time data processing.
- Strong programming skills in Java.
- Expertise in Spanner and BigQuery (BQ).
- Solid understanding of data modeling, ETL processes, and performance optimization.
- Ability to work in a fast-paced, collaborative environment.

Preferred Qualifications:
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Prior experience handling large-scale data systems.
Posted 3 weeks ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
As a software engineer at Google, you will have the opportunity to work on cutting-edge technologies that impact billions of users worldwide. Our projects involve handling massive amounts of information and go beyond traditional web search. We are seeking individuals with innovative ideas in areas such as distributed computing, information retrieval, system design, networking, security, artificial intelligence, UI design, and more. In this role, you will be part of the Play Developer Console team, responsible for various initiatives within the Play and Android ecosystem. Your responsibilities will include working on projects to enhance the developer experience, improve app ecosystem safety and quality, and collaborate with teams across different locations. You will work with technologies such as Java, Boq, Apps Framework, Stubby, Spanner, Dart, and ACX. Google Play offers a wide range of digital content such as music, movies, books, apps, and games that sync seamlessly across devices. As a member of the Android and Mobile team working on Google Play, you will contribute to backend systems engineering, product strategy, and content partnerships to provide users with a seamless experience across their devices. Your responsibilities will include writing code for product or system development, participating in design reviews, reviewing code from peers, contributing to documentation, triaging and resolving system issues, and more. We are looking for versatile engineers who demonstrate leadership qualities and enthusiasm for tackling new challenges across the full technology stack. If you have a Bachelor's degree or equivalent practical experience, at least 5 years of software development experience in multiple programming languages, and a year of experience in software design and architecture, we encourage you to apply. 
Preferred qualifications include 5 years of experience with data structures or algorithms, as well as familiarity with Dart, Apps Framework, Boq, and Spanner. Join us in shaping the future of technology and making an impact on a global scale with Google.
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before applying for a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the 500 largest companies in the world. Envision innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow. Job Description: Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer first, people led and innovation driven. UPS's India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of Logistics. Future You grows as a visible and valued Technology professional with UPS, driving us towards an exciting tomorrow. As a global Technology organization we can put serious resources behind your development. If you are solutions-oriented, UPS Technology is the place for you. Future You delivers ground-breaking solutions to some of the biggest logistics challenges around the globe. You'll take technology to unimaginable places and really make a difference for UPS and our customers.
Job Summary
This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.
Responsibilities
- Senior App Developer with a minimum of 5+ years of experience developing applications
- Experience creating REST services (APIs) with microservice design patterns in mind
- Familiarity with Spanner / SQL Server
- Experience creating integration services to consume/process data from other systems
- Familiarity with GCP Pub/Sub / AMQP is helpful
- Able to create CI/CD pipelines for the above services (Jenkins / Terraform)
- Able to create relevant documentation for each of the services
- Perform design reviews and code reviews
- Experience providing real-time knowledge transfer to the UPS team
- Establish UPS best practices in design/coding/testing
- Provide best practices for performance tuning
- Familiarity with testing, automation, and BDD testing frameworks is desired
- Provide best practices for distributed logging and log aggregation to ensure appropriate instrumentation for all services
- Develop microservices keeping in mind best practices for cloud-native applications (GCP GKE / OpenShift)
Qualifications
- Bachelor's Degree or International equivalent
- Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field - Preferred
Employee Type: Permanent
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
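One practical reason the posting pairs Pub/Sub with integration services: GCP Pub/Sub delivers messages at least once, so consumers are typically written idempotently. A minimal in-memory sketch of that pattern (the queue, message shape, and names are all hypothetical stand-ins, not UPS systems):

```python
import json
from collections import deque

# In-memory stand-in for a Pub/Sub subscription, including a duplicate delivery.
subscription = deque([
    json.dumps({"shipment_id": "S1", "status": "OUT_FOR_DELIVERY"}),
    json.dumps({"shipment_id": "S1", "status": "OUT_FOR_DELIVERY"}),  # redelivery
])

processed = {}   # shipment_id -> latest status
seen_keys = set()  # business keys already handled

def handle(message):
    """Idempotent handler: dedupe on a business key so redelivery is harmless."""
    event = json.loads(message)
    key = (event["shipment_id"], event["status"])
    if key in seen_keys:
        return  # already handled this logical event
    seen_keys.add(key)
    processed[event["shipment_id"]] = event["status"]

while subscription:
    handle(subscription.popleft())
```

With a real subscriber client the `handle` function would be the callback; deduplication state would live in a database rather than a process-local set.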
Posted 1 month ago
7.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Looking for an experienced GCP Cloud/DevOps Engineer (and/or OpenShift) to design, implement, and manage cloud infrastructure and services across multiple environments. This role requires deep expertise in Google Cloud Platform (GCP) services, DevOps practices, and Infrastructure as Code (IaC). The candidate will deploy, automate, and maintain high-availability systems, and implement best practices for cloud architecture, security, and DevOps pipelines.
Requirements
- Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field
- Must have 7+ years of extensive experience in designing, implementing, and maintaining applications on GCP and OpenShift
- Comprehensive expertise in GCP services such as GKE, Cloud Run, Cloud Functions, Cloud SQL, Firestore, Firebase, Apigee, GCP App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring
- Solid understanding of cloud security best practices and experience in implementing security controls in GCP
- Thorough understanding of cloud architecture principles and best practices
- Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles
- Proven leadership skills and the ability to mentor and guide a technical team
Key Responsibilities
Cloud Infrastructure Design and Deployment: Architect, design, and implement scalable, reliable, and secure solutions on GCP. Deploy and manage GCP services in both development and production environments, ensuring seamless integration with existing infrastructure. Implement and manage core services such as BigQuery, Data Fusion, Cloud Composer (Airflow), Cloud Storage, Compute Engine, App Engine, Cloud Functions, and more.
Infrastructure as Code (IaC) and Automation: Develop and maintain infrastructure as code using Terraform or CLI scripts to automate provisioning and configuration of GCP resources. Establish and document best practices for IaC to ensure consistent and efficient deployments across environments.
DevOps and CI/CD Pipeline Development: Create and manage DevOps pipelines for automated build, test, and release management, integrating with tools such as Jenkins, GitLab CI/CD, or equivalent. Work with development and operations teams to optimize deployment workflows, manage application dependencies, and improve delivery speed.
Security and IAM Management: Handle user and service account management in Google Cloud IAM. Set up and manage Secret Manager and Cloud Key Management for secure storage of credentials and sensitive information. Implement network and data security best practices to ensure compliance and security of cloud resources.
Performance Monitoring and Optimization: Set up observability tools like Prometheus and Grafana, and integrate security tools (e.g., SonarQube, Trivy). Configure DNS, networking, and persistent storage solutions in Kubernetes. Set up monitoring and logging (e.g., Cloud Monitoring, Cloud Logging, Error Reporting) to ensure systems perform optimally. Troubleshoot and resolve issues related to cloud services and infrastructure as they arise.
Workflow Orchestration and Containerization: Orchestrate complex workflows using the Argo Workflow Engine. Work extensively with Docker for containerization and image management. Troubleshoot and optimize containerized applications for performance and security.
Technical Skills
- Expertise with GCP and OpenShift (OCP) services, including but not limited to Compute Engine, Kubernetes Engine (GKE), BigQuery, Cloud Storage, Pub/Sub, Data Fusion, Airflow, Cloud Functions, and Cloud SQL
- Proficiency in scripting languages like Python, Bash, or PowerShell for automation
- Familiarity with DevOps tools and CI/CD processes (e.g., GitLab CI, Cloud Build, Azure DevOps, Jenkins)
Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
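A common way the IAM-management duty above shows up in practice is a least-privilege gate in CI: the pipeline fails if a service account requests roles outside an approved allowlist. A small sketch (the role ids are real GCP role names, but the allowlist policy itself is purely illustrative):

```python
# Hypothetical least-privilege policy: only these roles may be granted to
# application service accounts without a manual security review.
ALLOWED_ROLES = {
    "roles/bigquery.dataViewer",
    "roles/pubsub.subscriber",
    "roles/storage.objectViewer",
}

def violations(requested_roles):
    """Return the roles that exceed the allowlist (empty set means compliant)."""
    return set(requested_roles) - ALLOWED_ROLES

ok = violations({"roles/pubsub.subscriber"})
bad = violations({"roles/pubsub.subscriber", "roles/owner"})
```

In a Terraform-based setup the same check would run against the planned IAM bindings before `apply`, rather than against a Python set.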
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Backend Engineer with 5 - 7 years of experience, you will play a vital role in designing, developing, and optimizing backend services and microservices. Your expertise in Java, Spring Boot, SQL databases, and cloud platforms will be crucial in creating scalable and reliable solutions that drive our applications. Your main responsibilities will include designing and developing Java-based backend services and microservices using Spring Boot. You will collaborate with cross-functional teams to comprehend business requirements and translate them into technical solutions. Writing efficient and maintainable code that adheres to high-quality standards will be essential, along with optimizing existing code for performance enhancement. Managing SQL queries and database schema designs, implementing CI/CD pipelines using Jenkins and BitBucket, and testing and debugging applications using tools like Postman and your preferred IDE will also be part of your daily tasks. To deploy and manage services, you will utilize cloud platforms such as Google Kubernetes Engine (GKE), Spanner, BigQuery, Redis, and MongoDB. Working closely with front-end developers and architects to ensure smooth integration of services and mentoring junior developers on best practices and coding standards will also be part of your role. Collaborating with DevOps teams to guarantee the reliability and scalability of backend services will further enhance the efficiency of our operations. To be successful in this role, you should hold a Bachelor's degree in computer science, engineering, or a related field (Master's degree preferred) and possess a minimum of 5 years of hands-on experience in backend development using Java. Strong expertise in Java, Spring Boot, and microservices architecture is crucial, along with experience in any cloud platform (AWS/GCP/Azure). Proficiency in SQL database design, optimization, and querying is necessary, as well as experience with CI/CD using Jenkins and BitBucket. 
Familiarity with API testing and debugging tools like Postman, proficiency in using your preferred IDE, and knowledge of cloud platforms such as GKE, Spanner, BigQuery, Redis, and MongoDB will be advantageous. Strong problem-solving skills, attention to detail, and excellent communication and collaboration abilities are essential attributes for this role. The capacity to thrive in a fast-paced and dynamic environment will further contribute to your success in this position.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You will be working as an AI Platform Engineer in Bangalore as part of the GenAI COE Team. Your key responsibilities will involve developing and promoting scalable AI platforms for customer-facing applications. It will be essential to evangelize the platform with customers and internal stakeholders, ensuring scalability, reliability, and performance to meet business needs. Your role will also entail designing machine learning pipelines for experiment management, model management, feature management, and model retraining. Implementing A/B testing of models and designing APIs for model inferencing at scale will be crucial. You should have proven expertise with MLflow, SageMaker, Vertex AI, and Azure AI. As an AI Platform Engineer, you will serve as a subject matter expert in LLM serving paradigms, with in-depth knowledge of GPU architectures. Expertise in distributed training and serving of large language models, along with proficiency in model and data parallel training using frameworks like DeepSpeed and service frameworks like vLLM, will be required. Demonstrating proven expertise in model fine-tuning and optimization techniques to achieve better latencies and accuracies in model results will be part of your responsibilities. Reducing training and resource requirements for fine-tuning LLM and LVM models will also be essential. Having extensive knowledge of different LLM models and providing insights on their applicability based on use cases is crucial. You should have proven experience in delivering end-to-end solutions from engineering to production for specific customer use cases. Your proficiency in DevOps and LLMOps practices, along with knowledge of Kubernetes, Docker, and container orchestration, will be necessary. A deep understanding of LLM orchestration frameworks such as Flowise, Langflow, and Langgraph is also required. 
In terms of skills, you should be familiar with LLM models like Hugging Face OSS LLMs, GPT, Gemini, Claude, Mixtral, and Llama, as well as LLMOps tools like MLflow, LangChain, LangGraph, LangFlow, Flowise, LlamaIndex, SageMaker, AWS Bedrock, Vertex AI, and Azure AI. Additionally, knowledge of databases/data warehouse systems like DynamoDB, Cosmos DB, MongoDB, RDS, MySQL, PostgreSQL, Aurora, and Google BigQuery, as well as cloud platforms such as AWS, Azure, and GCP, is essential. Proficiency in DevOps tools like Kubernetes, Docker, Fluentd, Kibana, Grafana, and Prometheus, along with cloud certifications like AWS Professional Solution Architect and Azure Solutions Architect Expert, will be beneficial. Strong programming skills in Python, SQL, and JavaScript are required for this full-time role, with an in-person work location.
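The "A/B testing of models" responsibility above usually starts with deterministic traffic splitting: hashing the user id keeps a user's variant sticky across requests, which offline comparison of model quality needs. A minimal sketch (the variant names and 10% split are illustrative assumptions):

```python
import hashlib

def assign_variant(user_id, treatment_share=0.1):
    """Deterministically route a user to a model variant for A/B testing.

    Hashing the user id (instead of random assignment per request) keeps the
    routing sticky: the same user always hits the same model.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "model_b" if bucket < treatment_share else "model_a"

variant = assign_variant("user-42")
```

An inference API would call `assign_variant` once per request and dispatch to the matching model endpoint, logging the variant alongside the prediction for later analysis.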
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
This position provides leadership in full systems life cycle management to ensure delivery is on time and within budget. You will be responsible for directing component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. Additionally, you will develop and lead AD project activities and integrations, guide teams to ensure effective communication and achievement of objectives, research and support the integration of emerging technologies, and provide knowledge and support for applications development, integration, and maintenance. Leading junior team members with project-related activities and tasks, guiding and influencing department and project teams, and facilitating collaboration with stakeholders are also key aspects of this role.
Primary Skills:
- C#, .NET
- Web development (Angular)
- Database (SQL/PL-SQL)
- REST API (micro/web services)
- Cloud apps experience & DevOps (CI/CD)
Secondary Skills:
- Google Cloud Platform - GKE, Apigee, BigQuery, Spanner, etc.
- Agile experience
- Power BI
- Experience in Java apps
Responsibilities:
- Leads systems analysis and design.
- Leads design and development of applications.
- Develops and ensures creation of application documents.
- Defines and produces integration builds.
- Monitors emerging technology trends.
- Leads maintenance and support.
Qualifications:
- Bachelor's Degree or International equivalent
- Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field - Preferred
Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 1 month ago
0.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Job Summary
This position provides input and support for full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She performs tasks within planned durations and established deadlines.
This position collaborates with teams to ensure effective communication and support the achievement of objectives. He/She provides knowledge, development, maintenance, and support for applications. Should possess knowledge of Java and Spring Boot, be strong in Google Cloud, and have exposure to Spanner.
Responsibilities
- Generates application documentation.
- Contributes to systems analysis and design.
- Designs and develops moderately complex applications.
- Contributes to integration builds.
- Contributes to maintenance and support.
- Monitors emerging technologies and products.
Qualifications
- Bachelor's Degree or International equivalent
- Bachelor's Degree or International equivalent in Computer Science, Information Systems, Mathematics, Statistics, or related field
Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the Market Research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively. Moreover, you will be actively involved in project management and ensuring timely delivery of projects. To excel in this role, you should have a minimum of 5 years of experience in software development, out of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential for this role. Your technical skills should encompass a wide range of areas including Cloud & Infrastructure (Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN), Development Stack (C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration), Data & Integration (SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration), CI/CD & IaC (Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing), Security & Compliance (TLS/SSL certificate management, API gateway policies, encryption standards), and Monitoring & Performance (Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load testing tools). Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience in building low-code/no-code integration platforms or automation engines is also beneficial. 
Exposure to alternative clouds like AWS or on-prem virtualization platforms like VMware and OpenShift will be a plus. Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
You should have a Bachelor's degree or equivalent practical experience, along with 5 years of software development experience in one or more programming languages. Additionally, you should possess 3 years of experience in testing, maintaining, or launching software products, and at least 1 year of experience in software design and architecture. Moreover, you should have 3 years of experience in developing large-scale infrastructure, distributed systems or networks, or experience with compute technologies, storage, or hardware architecture. Preferred qualifications include 4 years of experience in data structures and algorithms, 1 year of experience in a technical leadership role, and expertise in C++, backend development, Spanner, Flume, and SQL. Experience in developing accessible technologies is also desirable. As a software engineer at Google, you will contribute to developing next-generation technologies that impact billions of users worldwide. The role involves working on projects critical to Google's needs, with opportunities to switch teams and projects as the business evolves. We are looking for engineers who are versatile, display leadership qualities, and are enthusiastic about tackling new challenges across the full stack to drive technology innovation forward. The Core team at Google builds the technical foundation for flagship products. As an engineer on the Core team, you will be responsible for the underlying design elements, developer platforms, product components, and infrastructure at Google. You will work to create excellent, safe, and coherent user experiences, drive innovation, and break down technical barriers across products. Your responsibilities will include managing services for User Media to enable new content features, infrastructure work to build reliable and secure systems, collaborating with product teams, and participating in a 24/7 tier-1/2 on-call rotation.
Posted 1 month ago
7.0 - 10.0 years
1 - 6 Lacs
Chennai
Work from Office
Key Responsibilities
- Design and develop large-scale data pipelines using GCP services (BigQuery, Dataflow, Dataproc, Pub/Sub).
- Implement batch and real-time ETL/ELT pipelines using Apache Beam and Spark.
- Manage and optimize BigQuery queries, partitioning, clustering, and cost control.
- Build distributed processing jobs on Dataproc (Hadoop/Spark) clusters.
- Develop and maintain streaming data pipelines with Pub/Sub and Dataflow.
- Work with Cloud Spanner to support highly available and globally scalable databases.
- Integrate data from various sources, manage schema evolution, and ensure data quality.
- Collaborate with data analysts, data scientists, and business teams to deliver scalable data solutions.
- Follow CI/CD, DevOps, and infrastructure-as-code best practices using tools like Terraform or Cloud Build.
- Monitor, debug, and tune data pipelines for performance and reliability.
Must-Have Skills
- GCP expertise: BigQuery, Dataflow, Dataproc, Cloud Spanner, Pub/Sub.
- Strong SQL skills and performance optimization in BigQuery.
- Solid experience in streaming (real-time) and batch processing.
- Proficiency in Apache Beam, Apache Spark, or similar frameworks.
- Python or Java for data processing logic.
- Understanding of data architecture, pipeline design patterns, and distributed systems.
- Experience with IAM roles, service accounts, and GCP security best practices.
- Familiarity with monitoring tools: Stackdriver, Dataflow job metrics, BigQuery query plans.
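The streaming responsibilities above center on windowed aggregation. The core idea, stripped of the Beam/Dataflow machinery, is grouping events into fixed-size (tumbling) windows; a plain-Python sketch with invented sample data (no Beam dependency):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count (timestamp, key) events per fixed-size tumbling window.

    Event timestamps are epoch seconds; each event lands in exactly one
    window, identified by the window's start time.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "a"), (59, "a"), (60, "a"), (61, "b")]
result = tumbling_window_counts(events)
# two 'a' events fall in window [0, 60); one 'a' and one 'b' in [60, 120)
```

In a real Dataflow pipeline, Beam's `FixedWindows` plus a combine step do the same grouping, with the added machinery of watermarks and late-data handling that this sketch omits.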
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate should be passionate about technology, dedicated to continuous learning, and committed to providing exceptional customer experiences through client interactions.
Qualifications:
- Must have a degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or related fields.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.
Job Description: The responsibilities of the PostgreSQL Database Developer include:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for better performance.
- Working knowledge of BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL.
- Expertise in tuning AlloyDB/Cloud SQL databases for optimal performance.
- Experience with GCP Database Migration Service, MongoDB, Cloud Dataflow, disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer Certification.
Roles & Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-prem to GCP cloud.
- Tune autovacuum in PostgreSQL.
- Performance-tune PostgreSQL stored procedures and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create a hybrid data store with data warehouse, NoSQL GCP solutions, and PostgreSQL.
- Migrate Oracle table data to AlloyDB.
- Lead the database team.
Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.
To apply, please share your resume at sonali.mangore@impetus.com with details of your current CTC, expected CTC, notice period, and last working day (LWD).
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

For this role at 66degrees, we are seeking a senior contractor for a 2.5-month remote assignment with the potential to extend. Candidates with the required skills who can work independently as well as within a team environment are encouraged to apply.

As part of the responsibilities, you will facilitate, guide, and influence the client and teams toward an effective architectural pattern, serving as an interface between business leadership, technology leadership, and the delivery teams. You will perform Migration Assessments and produce Migration Plans that include Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves.

You will also design a solution architecture on Google Cloud to support critical workloads, including heterogeneous Oracle migrations to PostgreSQL or Spanner, and design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security.

You will oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and third-party tools, and setting up and configuring the relevant Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide Education Workshops, Architectural Recommendations, and technology reviews and recommendations.
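The Migration Plan deliverables above include grouping applications into waves, where an application moves only after everything it depends on has moved. A small illustrative sketch of that layering, assuming a simple app-to-dependencies map (the function and data are hypothetical, not a 66degrees tool):

```python
def migration_waves(deps):
    """Group applications into migration waves via Kahn-style layering:
    each wave contains the apps whose remaining dependencies are empty.
    `deps` maps app -> set of apps it depends on."""
    remaining = {app: set(d) for app, d in deps.items()}
    waves = []
    while remaining:
        # Apps with no unmigrated dependencies can move in this wave.
        wave = sorted(app for app, d in remaining.items() if not d)
        if not wave:
            raise ValueError("circular dependency between applications")
        waves.append(wave)
        for app in wave:
            del remaining[app]
        for d in remaining.values():
            d.difference_update(wave)
    return waves

deps = {"reporting": {"warehouse"}, "warehouse": {"oltp"},
        "oltp": set(), "auth": set()}
print(migration_waves(deps))
# [['auth', 'oltp'], ['warehouse'], ['reporting']]
```

A real Migration Plan layers additional constraints on top of dependency order (downtime windows, team capacity, TCO per wave), but the topological grouping is the structural core.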
Qualifications:
- 5+ years of experience in data engineering, cloud architecture, or data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle-adjacent products such as GoldenGate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience performing performance testing and applying remediations to address performance issues.
- Experience designing data models.
- Proficiency in Python and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal.
- Proven experience migrating and/or implementing cloud databases such as Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer certification is preferred.

66degrees is committed to protecting your privacy. Your personal information is collected, used, and shared in accordance with the California Consumer Privacy Act (CCPA).
Posted 1 month ago