8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What You Will Do
Let's do this. Let's change the world. In this vital role, you will lead the engagement model between Amgen's Technology organization and Global Commercial Operations. Collaborate with R&D Business SMEs, Data Engineers, Data Scientists and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI Product Teams. Become an R&D domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen's Quality System. Lead the voice-of-the-customer assessment to define business processes and product needs. Work with Product Managers and customers to define scope and value for new developments. Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product Backlog. Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs. Facilitate the breakdown of Epics into Features and sprint-sized User Stories, and participate in backlog reviews with the development team. Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog. Ensure Acceptance Criteria and Definition of Done are well-defined. Work closely with Business SMEs, Data Scientists, and ML Engineers to understand data product requirements, KPIs, etc. Analyze the source systems and create the STTM (source-to-target mapping) documents. Develop and implement effective product demonstrations for internal and external partners. Maintain accurate documentation of configurations, processes, and changes. Understand end-to-end data pipeline design and dataflow. Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team.
What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders, with these qualifications.
Basic Qualifications: Master's degree with 8 - 12 years of experience in R&D Informatics; OR Bachelor's degree with 10 - 14 years of experience in R&D Informatics; OR Diploma with 14 - 18 years of experience in R&D Informatics. Mandatory work experience acting as a business analyst in DWH, data product building, and BI & Analytics applications.
Experience in analyzing the requirements of BI, AI & Analytics applications and working with data source SMEs and Data Owners to identify the data sources and data flows. Experience with writing user requirements and acceptance criteria. Affinity for working in a DevOps environment and an Agile mindset. Ability to work in a team environment, effectively interacting with others. Ability to meet deadlines and schedules and be accountable.
Preferred Qualifications:
Must-Have Skills: Excellent problem-solving skills and a passion for solving complex challenges for AI-driven technologies. Experience with Agile software development methodologies (Scrum). Superb communication skills and the ability to work with senior leadership with confidence and clarity. Experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA. Experience in managing product features for PI planning and developing product roadmaps and user journeys.
Good-to-Have Skills: Demonstrated expertise in data and analytics and related technology concepts. Understanding of data and analytics software systems strategy, governance, and infrastructure. Familiarity with low-code/no-code test automation software. Technical thought leadership. Able to communicate technical or complex subject matter in business terms. Jira Align experience. Experience with DevOps, Continuous Integration, and Continuous Delivery methodology.
Soft Skills: Able to work under minimal supervision. Excellent analytical and gap/fit assessment skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
Technical Skills: Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) and AWS or similar cloud-based platforms. Experience with design patterns, data structures, and test-driven development. Knowledge of NLP techniques for text analysis and sentiment analysis.
What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have 5+ years of experience in core Java and the Spring Framework. Additionally, you must have at least 2 years of experience in Cloud technologies such as GCP, AWS, or Azure, with a preference for GCP. It is required to have experience in big data processing on a distributed system and in working with databases including RDBMS, NoSQL, and cloud-native databases. You should also have expertise in handling various data formats like flat files, JSON, Avro, XML, etc., including defining schemas and contracts. Furthermore, you should have experience in implementing data pipelines (ETL) using Dataflow (Apache Beam) and in working with microservices and integration patterns of APIs with data processing. Experience in data structures and in defining and designing data models will be beneficial for this role.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide in designing and developing innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in various industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company.
Requirements
Leadership & Strategy: As a part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.
Leadership Experience: With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management, resource planning, and strong presentation and communication skills for executive-level reporting are essential for this role.
Certifications (Preferred): Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.
Technical Excellence: You should have over 10 years of experience in designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.
Job Responsibilities
Technical Skills: Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role.
Cross-functional Collaboration: Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role.
What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.
About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 1 week ago
5.0 years
0 Lacs
Madurai, Tamil Nadu, India
On-site
We are seeking a hands-on GCP Data Engineer/Lead/Architect with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.
Key Responsibilities
Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery. Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data. Work closely with stakeholders to understand data requirements and translate them into scalable designs. Optimize streaming pipeline performance, latency, and throughput. Build and manage orchestration workflows using Cloud Composer (Airflow). Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets. Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver. Experience with data modeling. Ensure robust security, encryption, and access controls across all data layers. Collaborate with DevOps for CI/CD automation of data workflows using Terraform, Cloud Build, and Git. Document streaming architecture, data lineage, and deployment runbooks.
Required Skills & Experience
5+ years of experience in data engineering or architecture. 3+ years of hands-on GCP data engineering experience. Strong expertise in: Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS). Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture. Deep knowledge of SQL and NoSQL data modeling. Hands-on experience with monitoring and performance tuning of streaming jobs. Experience using Terraform or equivalent for infrastructure as code. Familiarity with CI/CD pipelines for data workflows.
Benefits
Great work takes place when people feel appreciated and supported. That's why TechMango believes in shaping an environment where you can bloom, stay inspired, and benefit from a healthy work-life balance. Because you're not just an employee but a part of something greater. Benefits include a badminton ground, free accommodation, cab facility for female employees, insurance, gym, subsidised food, awards and recognition, and medical checkups. Join TechMango as a GCP Data Engineer to innovate and drive data excellence. Leverage cutting-edge technology in a dynamic team environment. Apply now for an exciting career opportunity!
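To make the streaming stack above concrete, here is a minimal sketch (not TechMango's actual pipeline) of an Apache Beam job in Python that reads JSON events from Pub/Sub and streams them into BigQuery; the project, topic, table, and schema names are hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resource names for illustration only.
TOPIC = "projects/example-project/topics/order-events"
TABLE = "example-project:analytics.order_events"

options = PipelineOptions(streaming=True)  # pass --runner=DataflowRunner to run on Dataflow

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            TABLE,
            schema="order_id:STRING,amount:NUMERIC,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

Submitted with the Dataflow runner and the usual project/region options, the same code runs as a managed streaming job, which is the typical deployment path for pipelines like the ones this posting describes.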
Posted 1 week ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. ZoomInfo is a rapidly growing data-driven company, and as such, we understand the importance of a comprehensive and solid data solution to support decision making in our organization. Our vision is to have a consistent, democratized, and accessible single source of truth for all company data analytics and reporting. Our goal is to improve decision-making processes by having the right information available when it is needed. As a Principal Software Engineer in our Data Platform infrastructure team, you'll have a key role in building and designing the strategy of our Enterprise Data Engineering group.
What You'll Do:
Design and build a highly scalable data platform to support data pipelines for diversified and complex data flows. Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities. Deliver scalable, reliable and reusable data solutions. Lead, build and continuously improve our data gathering, modeling, and reporting capabilities and self-service data platforms. Work closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs. Develop processes and tools to monitor, analyze, maintain and improve data operation, performance and usability.
What You Bring:
Relevant Bachelor's degree or other equivalent Software Engineering background. 12+ years of experience as an infrastructure / data platform / big data software engineer. Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena. IaC design and hands-on experience. Familiarity designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools. Experience in designing, building and maintaining enterprise systems in a big data environment on public cloud. Strong SQL abilities and hands-on experience with SQL, performing analysis and performance optimizations. Hands-on experience in Python or an equivalent programming language. Experience with administering data warehouse solutions (like BigQuery/Redshift/Snowflake). Experience with data modeling, data catalog concepts, data formats, data pipelines/ETL design, implementation and maintenance. Experience with Airflow and DBT - advantage. Experience with Kubernetes using GKE or EKS - advantage. Experience with development practices (Agile, TDD) - advantage.
About Us:
ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller. ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here. ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status.
We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements. For Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. ZoomInfo does not administer lie detector tests to applicants in any location.
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Company: Our client is a global technology consulting and digital solutions company that enables enterprises to reimagine business models and accelerate innovation through digital technologies. Powered by more than 84,000 entrepreneurial professionals across more than 30 countries, it caters to over 700 clients with its extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes.
Job Title: Java Developer with GCP
Location: Pune (Shivajinagar)
Experience: 6 to 8 Years
Employment Type: Contract to Hire
Work Mode: Hybrid
Notice Period: Immediate Joiners Only
Job Description: Bachelor's in Computer Science, Engineering, or equivalent experience. 7+ years of experience in core Java and the Spring Framework (Required). 2+ years of Cloud experience (GCP, AWS, or Azure; GCP preferred) (Required). Experience in big data processing on a distributed system (Required). Experience in databases: RDBMS, NoSQL databases, and cloud-native databases (Required). Experience in handling various data formats like flat files, JSON, Avro, XML, etc., including defining the schemas and the contracts (Required). Experience in implementing data pipelines (ETL) using Dataflow (Apache Beam). Experience in microservices and integration patterns of APIs with data processing. Experience in data structures, and in defining and designing data models.
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Required Skills: 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems and data engineering (e.g., ETL, BigQuery, SQL, Python or Alteryx). Advanced knowledge of SQL programming and database management. 3+ years of solid experience with one or more Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker or Tableau. Knowledge of data warehousing concepts and best practices. Excellent problem-solving and analytical skills. Detail-oriented with strong communication and collaboration skills. Ability to work independently and as part of a team.
Preferred Skills: Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, Cloud SQL, Looker and LookML, Data Studio, and Qlik Sense on GCP. Strong SQL skills and various BI/reporting tools to build self-serve reports, analytic dashboards and ad-hoc packages leveraging our enterprise data warehouse. 1+ year of experience with Python. 1+ year of experience with Hive/Spark/Scala/JavaScript. Strong experience consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations. Development delivery experience. Solid understanding of BI tools, architectures, and visualization solutions. Inquisitive, proactive, and interested in learning new tools and techniques. Strong oral, written and interpersonal communication skills. Comfortable working in a dynamic environment where problems are not always well-defined.
Responsibilities
Develop and maintain data pipelines, reporting and dashboards using SQL and Business Intelligence reporting tools such as Power BI, Qlik Sense and Looker. Develop and execute database queries by applying advanced knowledge of SQL and experience working with relational databases and Google BigQuery. Collaborate with stakeholders to define requirements from problem statements and develop data-driven insights. Perform data validation and code review to assure data accuracy and data quality/integrity across all systems. Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Qualifications
Bachelor's degree in Computer Science, Computer Information Systems, or related field. 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems and data engineering (e.g., ETL, BigQuery, SQL, Python or Alteryx). Advanced knowledge of SQL programming and database management. 3+ years of solid experience with one or more Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker or Tableau. Knowledge of data warehousing concepts and best practices. Excellent problem-solving and analytical skills. Detail-oriented with strong communication and collaboration skills. Ability to work independently and as part of a team.
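As a small, hypothetical example of the BigQuery/SQL work this role revolves around, the snippet below runs a parameterized query with the google-cloud-bigquery Python client; the project, dataset, and column names are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical table and columns; adjust to the actual warehouse schema.
sql = """
    SELECT region, SUM(sales_amount) AS total_sales
    FROM `example-project.reporting.orders`
    WHERE order_date >= @start_date
    GROUP BY region
    ORDER BY total_sales DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")]
)

for row in client.query(sql, job_config=job_config).result():
    print(f"{row.region}: {row.total_sales}")
```

A result set like this is typically surfaced through Power BI, Qlik Sense, or Looker rather than printed, but the query-building and validation pattern is the same.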
Posted 1 week ago
12.0 years
27 - 35 Lacs
Madurai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate
About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.
Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP). Define data strategy, standards, and best practices for cloud data engineering and analytics. Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery. Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery). Architect data lakes, warehouses, and real-time data platforms. Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP). Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers. Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards. Provide technical leadership in architectural decisions and future-proofing the data ecosystem.
Required Skills & Qualifications
10+ years of experience in data architecture, data engineering, or enterprise data platforms. Minimum 3-5 years of hands-on experience with GCP data services. Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, Star/Snowflake schema). Experience with real-time data processing, streaming architectures, and batch ETL pipelines. Good understanding of IAM, networking, security models, and cost optimization on GCP. Prior experience in leading cloud data transformation projects. Excellent communication and stakeholder management skills.
Preferred Qualifications
GCP Professional Data Engineer / Architect Certification. Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics. Exposure to AI/ML use cases and MLOps on GCP. Experience working in agile environments and client-facing roles.
What We Offer
Opportunity to work on large-scale data modernization projects with global clients. A fast-growing company with a strong tech and people culture. Competitive salary, benefits, and flexibility. Collaborative environment that values innovation and leadership.
Skills: Google Cloud Platform (GCP), GCP data services, data architecture
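For illustration only (not the client's actual workflow), a Cloud Composer DAG that orchestrates a nightly BigQuery curation step might look like the sketch below; the DAG ID, dataset, and SQL are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical curation query; real projects would template this per table.
CURATE_SQL = """
    CREATE OR REPLACE TABLE `example-project.curated.daily_orders` AS
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM `example-project.raw.orders`
    GROUP BY order_date
"""

with DAG(
    dag_id="curate_daily_orders",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    curate = BigQueryInsertJobOperator(
        task_id="curate_orders",
        configuration={"query": {"query": CURATE_SQL, "useLegacySql": False}},
    )
```

In practice the same DAG would also own ingestion tasks (Dataflow or Pub/Sub triggers), data-quality checks, and alerting hooks so that SLAs and lineage stay visible in one place.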
Posted 1 week ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
2-4 years of experience using Microsoft SQL Server (version 2008 or later). Ability to create and maintain complex T-SQL queries, views, and stored procedures. 0-1+ years of experience performing advanced ETL development, including various dataflow transformation tasks. Ability to monitor performance and improve it by optimizing code and creating indexes. Proficient with Microsoft Access and Microsoft Excel. Knowledge of descriptive statistical modeling methodologies and techniques such as classification, regression, and association activities to support statistical analysis of various healthcare data. Strong knowledge of data warehousing concepts. Strong written, verbal and customer service skills. Proficiency in compiling data, creating reports and presenting information, including expertise with query tools, MS Excel and/or other such products like SSRS, Tableau, Power BI, etc. Proficiency with various data forms including but not limited to star and snowflake schemas. Ability to translate business needs into practical applications. Desire to work within a fast-paced environment. Ability to work in a team environment and be flexible in taking on various projects.
Posted 1 week ago
0 years
0 Lacs
India
On-site
The Consulting Data Engineer role requires experience in both traditional warehousing technologies (e.g., Teradata, Oracle, SQL Server) and modern database/data warehouse technologies (e.g., AWS Redshift, Azure Synapse, Google BigQuery, Snowflake), as well as expertise in ETL tools and frameworks (e.g., SSIS, Azure Data Factory, AWS Glue, Matillion, Talend), with a focus on how these technologies affect business outcomes. This person should have experience with both on-premises and cloud deployments of these technologies and in transforming data to adhere to logical and physical data models, data architectures, and engineering a dataflow to meet business needs. This role will support engagements such as data lake design, data management, migrations of data warehouses to the cloud, and database security models, and ideally should have experience in a large enterprise in these areas. Develops high-performance distributed data warehouses, distributed analytic systems and cloud architectures. Participates in developing relational and non-relational data models designed for optimal storage and retrieval. Develops, tests, and debugs batch and streaming data pipelines (ETL/ELT) to populate databases and object stores from multiple data sources using a variety of scripting languages; provides recommendations to improve data reliability, efficiency and quality. Works alongside data scientists, supporting the development of high-performance algorithms, models and prototypes. Implements data quality metrics, standards, and guidelines; automates data quality checks/routines as part of data processing frameworks; validates flow of information. Ensures that Data Warehousing and Big Data systems meet business requirements and industry practices, including but not limited to automation of system builds, security requirements, performance requirements and logging/monitoring requirements.
Knowledge, Skills, and Abilities
Ability to translate a logical data model into a relational or non-relational solution. Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran. Hands-on experience in setting up end-to-end cloud-based data lakes. Hands-on experience in database development using views, SQL scripts and transformations. Ability to translate complex business problems into data-driven solutions. Working knowledge of reporting tools like Power BI, Tableau, etc. Ability to identify data quality issues that could affect business outcomes. Flexibility in working across different database technologies and propensity to learn new platforms on the fly. Strong interpersonal skills. Team player prepared to lead or support depending on the situation.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Role: We are currently hiring for the position of Director – LL5, who will serve as the owner and strategic leader of a multidisciplinary team comprising data scientists, data engineers, and software engineers. This role will oversee all stages of analytics services and product development, including defining and framing business problems; identifying relevant data sources; designing and developing analytical models; validating model performance and effectiveness; and leading product deployment and launch initiatives. This is a high-impact leadership opportunity to shape the future of data-driven decision-making at Ford.
Scope: Drive AI and Analytics innovation for Finance Modernization.
Responsibilities
Provide strategic leadership and oversight for the Finance and Insurance Analytics teams located in the U.S. and India. Drive Finance modernization initiatives within the Global Data Insight and Analytics (GDIA) organization. Deliver actionable insights through clear, compelling communication with business stakeholders and executive leadership. Foster effective collaboration and negotiation across all levels of the organization to achieve business outcomes. Partner with Product Line Owners to generate demand and align requirements with broader business objectives. Establish priorities, allocate resources, and ensure high-quality and timely delivery of all associated projects. Act as a key liaison for technical collaboration between Finance and Insurance Analytics and other GDIA departments including DPE, AIAC, ISA, and SSDA. Lead strategic optimization of team resources and budgetary planning. Anticipate future business needs and translate them into actionable initiatives and analytics projects. Oversee recruitment, onboarding, and professional development of team members to foster a high-performing, diverse workforce. Actively participate in hiring efforts and promote diversity, equity, and inclusion across the department. Manage purchased service engagements to support Finance and Insurance Analytics activities and delivery.
Qualifications
Master's or Bachelor's degree in engineering or a quantitative field.
Proven hands-on expertise in Artificial Intelligence, with deep domain knowledge of Finance Modernization strategies and initiatives. Skilled in leveraging big data technologies including SQL, Spark, and Hive to drive business insights. Extensive experience with Google Cloud Platform and associated tools such as Python, Spark, Dataflow, BigQuery, GitHub, Qlik Sense, CPLEX, Mach1ML, and Power BI. Demonstrated success in developing and deploying analytical models within cloud-based environments, particularly GCP. Well-versed in advanced AI disciplines including Natural Language Processing (NLP), Deep Learning, and modern neural network architectures such as CNNs, RNNs, Embeddings, Transfer Learning, and Transformers. Strong capabilities in business engagement, with a track record of translating complex problems into structured, impactful solutions. Known for meticulous attention to detail and a strong drive toward continuous improvement. Adept at balancing innovation and analytical rigor, applying logical, methodical problem-solving approaches to dynamic challenges. Highly articulate and credible communicator with strong presentation and interpersonal skills, able to influence at all levels of the organization. Effective at managing multiple priorities while maintaining high-quality outputs in fast-paced environments.
Desired: Extensive experience in Finance Analytics, encompassing data-driven financial insights, forecasting, performance analysis, and strategic decision support.
Posted 1 week ago
7.0 years
17 - 18 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Full-time
Budget: Up to ₹18 LPA
Position Overview
We are seeking a proactive and technically strong Full Stack Software Engineer to support the development and hosting of Supply Chain Analytics algorithms. The ideal candidate will be passionate about clean code, modern development practices, and continuous learning. You will be responsible for building scalable web applications, backend APIs, and data-driven tools by applying Agile practices like pair programming and Test-Driven Development (TDD) in a fast-paced environment. The role also includes mentoring teams, contributing to technical excellence, and actively engaging with leadership.
Key Responsibilities
Design and develop full-stack solutions using Python, Java, and Spring Boot. Build responsive web UIs using JavaScript, Angular, React, Vue, or TypeScript. Develop and consume RESTful APIs, and integrate with Pub/Sub, Apigee, and Cloud Storage. Work with both relational (PostgreSQL, SQL Server) and NoSQL/columnar (BigQuery) databases. Deploy applications to cloud platforms such as Google Cloud Platform (GCP), AWS, Azure, or PCF. Apply DevOps and CI/CD practices using tools such as Jenkins, Tekton, Gradle, and Terraform. Collaborate with cross-functional teams in an Agile/Scrum setup using tools like JIRA.
Must-Have Skills
5-7 years of hands-on software engineering experience. 3+ years in: Python, Java, Spring Boot; frontend frameworks (Angular/React/Vue); REST API development. Cloud experience (GCP preferred: Cloud Run, Cloud Storage, BigQuery). Working knowledge of relational and NoSQL databases. Strong understanding of Agile methodologies and TDD. Excellent written and verbal communication skills.
Good to Have
Experience with automated testing (unit, integration, E2E). Exposure to containerization (Docker, Kubernetes). Familiarity with Dataflow, Airflow, DataFusion, PySpark. Knowledge of CI/CD tools and infrastructure automation (Terraform, Tekton).
Educational Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
Skills: TypeScript, RESTful APIs, Python, PCF
Posted 1 week ago
5.0 years
17 - 18 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Software Engineer Practitioner – Full Stack Data Engineer
Location: Chennai (Hybrid) - 34305
Type: Full-time
Compensation: Up to ₹18 LPA
About The Role
We are seeking a seasoned Full Stack Data Engineer to join our Enterprise Data Platform team. This role is crucial in designing, building, and optimizing scalable data pipelines on the Google Cloud Platform (GCP), using native tools such as BigQuery, Dataform, Dataflow, and Pub/Sub. You will ensure best practices in data governance, security, and performance while collaborating closely with cross-functional teams. This is a high-impact opportunity to influence Ford's data engineering architecture and contribute to the company's digital transformation.
Key Responsibilities
Design, develop, and maintain robust, scalable data pipelines on GCP using BigQuery, Dataform, Dataflow, and Pub/Sub. Collaborate with data engineering, architecture, and product teams to build data models, solutions, and automation. Ensure data governance, security, auditability, and high performance across all pipelines. Build custom cloud-native solutions leveraging tools like Data Fusion, Airflow, and Terraform. Optimize data transformation workflows using Python (NumPy, Pandas, PySpark, etc.) and SQL. Engage with stakeholders to understand data requirements and translate them into scalable engineering solutions. Participate in Agile/Scrum processes, including writing user stories and contributing to sprint planning. Drive the adoption of best practices in data warehousing, data lake design, and DevOps.
Must-Have Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in full stack data engineering and database performance optimization. Strong command of GCP native tools: BigQuery, Dataform, Dataflow, Pub/Sub, Data Fusion. Proficient in Python, Java, SQL, and data orchestration tools like Airflow. Experience with Terraform, Tekton, and version control (e.g., Git). Solid understanding of data architecture, ETL/ELT pipelines, and data governance. Deep familiarity with Agile methodology, DevOps, and collaborative product development. Excellent communication skills and stakeholder engagement experience.
Preferred Skills (Nice to Have)
Experience with PostgreSQL, Dataproc, Cloud SQL, and containerization tools. Knowledge of industrial or enterprise data products and real-time data streaming. Experience working in a regulated or large enterprise environment.
Additional Information
Interact with internal data and analytics product lines to identify technical opportunities. Influence design standards and ensure reusability and scalability of developed components. Support process improvements across data delivery, curation, and analytics operations.
Skills: Dataflow, Python, Java, GCP, ETL
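As a rough, hypothetical sketch of one pattern this role touches, the snippet below consumes messages from a Pub/Sub subscription and lands them in BigQuery via streaming inserts using the Google Cloud Python clients. The subscription, table, and payload shape are assumptions, and a production pipeline of the kind described above would more likely run this step in Dataflow or model it in Dataform.

```python
import json

from google.cloud import bigquery, pubsub_v1

# Hypothetical resource names.
SUBSCRIPTION = "projects/example-project/subscriptions/telemetry-sub"
TABLE_ID = "example-project.curated.telemetry_events"

bq_client = bigquery.Client()

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data.decode("utf-8"))  # assumes one JSON object per message
    errors = bq_client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()

subscriber = pubsub_v1.SubscriberClient()
streaming_pull = subscriber.subscribe(SUBSCRIPTION, callback=handle_message)
streaming_pull.result()  # block and process messages until interrupted
```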
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance the effectiveness of QA strategies.
What You Will Do
Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.
What Experience You Need
Bachelor's degree in a STEM major or equivalent experience. 5-7 years of software testing experience. Able to create and review test automation according to specifications. Ability to write, debug, and troubleshoot code in Java, SpringBoot, TypeScript/JavaScript, HTML, CSS. Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation. Created test strategies and plans. Led complex testing efforts or projects. Participated in Sprint Planning as the Test Lead. Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes. Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Cloud Certification Strongly Preferred.
What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e.
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements; Performance / Resilience: Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes
Posted 1 week ago
3.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Google Cloud Architect in Pune (Hybrid) with over 10 years of experience, including 3+ years specifically on GCP, you will play a crucial role in leading the design and delivery of comprehensive cloud solutions on Google Cloud Platform. Your responsibilities will involve collaborating with data engineering, DevOps, and architecture teams to create scalable, secure, and cost-effective cloud platforms. Your key responsibilities will include designing scalable data and application architectures utilizing tools such as BigQuery, Dataflow, Composer, Cloud Run, Pub/Sub, and other related GCP services. You will be leading cloud migration, modernization, and CI/CD automation through the use of technologies like Terraform, Jenkins, GitHub, and Cloud Build. Additionally, you will be responsible for implementing real-time and batch data pipelines, chatbot applications using LLMs (Gemini, Claude), and automating reconciliation and monitoring processes. Your role will also involve collaborating closely with stakeholders to ensure technical solutions align with business objectives. The ideal candidate for this role should have a minimum of 3 years of experience working with GCP and possess a strong proficiency in key tools such as BigQuery, Dataflow, Cloud Run, Airflow, GKE, and Cloud Functions. Hands-on experience with Terraform, Kubernetes, Jenkins, GitHub, and cloud-native CI/CD is essential. In addition, you should have a solid understanding of DevSecOps practices, networking, and data architecture concepts like Data Lake, Lakehouse, and Mesh. Proficiency in Python, SQL, and ETL frameworks such as Ab Initio is also required. Preferred qualifications for this role include GCP Certifications (Cloud Architect, DevOps, ML Engineer), experience with Azure or hybrid environments, and domain expertise in sectors like Banking, Telecom, or Retail.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology. Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions. Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax Solutions. To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, SpringBoot, TypeScript/JavaScript, as well as hands-on experience with Cloud technologies such as GCP, AWS, or Azure, is essential. You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes. Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm Charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role. Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like JAVA/J2EE and SpringBoot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing. If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Engineer
Location: Hyderabad / Pune (Preference)
Joining: Immediate or within 15
Required: 3-5 years
Are you a Python & PySpark expert with hands-on experience in GCP? Passionate about building scalable, high-performance data pipelines? Join our fast-paced team and be part of impactful projects in the cloud data space.
Responsibilities: Design, build, and maintain scalable and efficient data pipelines using Python and PySpark. Develop and optimize ETL/ELT workflows for large-scale datasets. Work extensively with Google Cloud Platform (GCP) services including BigQuery, Dataflow, and Cloud Functions. Implement containerized solutions using Docker, and manage code through Git and CI/CD pipelines. Collaborate with data scientists, analysts, and other engineers to deliver high-quality data solutions. Monitor, troubleshoot, and improve the performance of data pipelines.
Skills & Qualifications: Proficiency in Python, PySpark, and Big Data technologies. Strong experience in ETL/ELT, data modeling, distributed computing, and performance tuning. Hands-on expertise in GCP and its services. Working knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer. GCP certification is a plus. Experience with Docker, CI/CD practices, and version control tools like Git. (ref:hirist.tech)
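To ground the PySpark requirement, here is a minimal batch ETL sketch under assumed inputs; the GCS paths, columns, and aggregation exist only for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Hypothetical raw input on GCS; on Dataproc the gs:// connector is available by default.
orders = (
    spark.read.json("gs://example-raw-bucket/orders/2024-01-01/")
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("status") == "COMPLETED")
)

daily = orders.groupBy("order_date").agg(
    F.count("*").alias("order_count"),
    F.sum("amount").alias("revenue"),
)

# Write a curated, partition-friendly output for downstream BigQuery loads or reporting.
daily.write.mode("overwrite").parquet("gs://example-curated-bucket/daily_orders/")

spark.stop()
```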
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
The AIML Architect-Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and technical vision. You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization. Key Responsibilities - Design and architect data processing solutions using Google Cloud BigQuery and Dataflow. - Develop data pipeline frameworks supporting batch and real-time analytics. - Implement machine learning algorithms for extracting insights from large datasets. - Optimize data storage and retrieval processes to improve performance. - Collaborate with data scientists to build scalable models. - Ensure data quality and integrity throughout the data lifecycle. - Work closely with cross-functional teams to align data workflows with business objectives. - Conduct technical evaluations and assessments of new tools and technologies. - Manage large-scale data migrations to cloud environments. - Document architecture designs and maintain technical specifications. - Provide mentorship and guidance to junior data engineers and analysts. - Stay updated on industry trends in cloud computing and data engineering. - Design and implement security best practices for data access and storage. - Monitor and troubleshoot data pipeline performance issues. - Conduct training sessions on BigQuery best practices for team members. Required Qualifications - Bachelor's or Master's degree in Computer Science, Data Science, or related field. - 5+ years of experience in data architecture and engineering. - Proficiency in Google Cloud Platform, especially BigQuery and Dataflow. - Strong understanding of data modeling and ETL processes. - Experience in implementing machine learning solutions in cloud environments. - Solid programming skills in Python, Java, or Scala. - Expertise in SQL and other query optimization techniques. - Experience with big data workloads and distributed computing. - Familiarity with modern data processing frameworks and tools. - Strong analytical and problem-solving skills. - Excellent communication and team collaboration abilities. - Proven track record of managing comprehensive projects from inception to completion. - Ability to work in a fast-paced, agile environment. - Understanding of data governance, compliance, and security. - Experience with data visualization tools is a plus. - Certifications in Google Cloud or relevant technologies are advantageous. 
Skills: Cloud Computing, SQL Proficiency, Dataflow, AIML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Opportunity: Full Stack Data Engineer We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP Native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford. Responsibilities What You'll Do: ( Responsibilities) Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines. End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight. GCP Data Solutions Leader : Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations. Data Governance & Security Champion : Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data. Data Workflow Orchestrator : Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC). Performance Optimization Driver : Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness. Collaborative Innovator : Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering. Automation & Reliability Advocate : Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency. Effective Communicator : Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment. Continuous Learner : Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities. Business Impact Translator : Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals. Documentation & Knowledge Sharer : Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability. 
Qualifications What You'll Bring: (Qualifications) Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience). 3-5 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred). Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc. Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform. Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery). Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks. Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues. Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc). A passion for data, innovation, and continuous learning.
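For context on the kind of ingestion pipeline this posting describes, here is a minimal, non-authoritative sketch (not Ford's actual codebase) of a streaming Apache Beam pipeline that reads JSON events from Pub/Sub and appends them to an existing BigQuery table; the project, topic, and table names are hypothetical placeholders.

```python
# Hedged sketch of a Pub/Sub -> Dataflow -> BigQuery ingestion pipeline.
# All resource names below are hypothetical; the target table is assumed to exist.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a flat record suitable for BigQuery."""
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["id"], "payload": json.dumps(record)}


def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, etc. on the command line.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events"  # hypothetical topic
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```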
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data against functional business requirements and engage directly with customers. Preferred Education Master's Degree Required Technical And Professional Expertise 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred Technical And Professional Experience Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
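As a hedged illustration of the day-to-day analysis work this technical analyst role centres on, the snippet below queries BigQuery from Python with the official client library; the project, dataset, and column names are placeholders invented for the example, not IBM or client assets.

```python
# Minimal sketch: run an analytical query against BigQuery and print the results.
# Project, dataset, table, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

query = """
    SELECT status, COUNT(*) AS orders
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY status
    ORDER BY orders DESC
"""

# result() blocks until the job finishes and yields Row objects with attribute access.
for row in client.query(query).result():
    print(f"{row.status}: {row.orders}")
```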
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position - Python Developer (GCP) Experience - 5+ Years Location - Bangalore / WFO Job Description We are looking for an experienced Python Developer (5+ years) with hands-on expertise in Google Cloud Platform (GCP) services. The ideal candidate will be responsible for developing and deploying scalable applications, data processing pipelines, and cloud-native solutions using Python and GCP tools. Responsibilities Develop and maintain Python-based applications and services. Design and implement data pipelines and cloud functions using GCP services such as Cloud Functions, Cloud Run, Pub/Sub, Dataflow, and BigQuery. Integrate APIs and third-party services. Optimize performance and scalability of cloud-based applications. Collaborate with DevOps and data teams to build CI/CD pipelines and manage infrastructure using tools like Terraform or Deployment Manager. Write clean, maintainable, and well-documented code. Troubleshoot and resolve technical issues in production and development environments. Perks & Benefits Health and Wellness: Healthcare policy covering your family and parents. Food: Enjoy a scrumptious buffet lunch at the office every day (for Bangalore). Professional Development: Learn and propel your career. We provide workshops, funded online courses and other learning opportunities based on individual needs. Rewards and Recognitions: Recognition and rewards programs in place to celebrate your achievements and contributions. Why join Relanto? Health & Family: Comprehensive benefits for you and your loved ones, ensuring well-being. Growth Mindset: Continuous learning opportunities to stay ahead in your field. Dynamic & Inclusive: Vibrant culture fostering collaboration, creativity, and belonging. Career Ladder: Internal promotions and clear path for advancement. Recognition & Rewards: Celebrate your achievements and contributions. Work-Life Harmony: Flexible arrangements to balance your commitments.
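To ground the "Cloud Functions plus Pub/Sub plus BigQuery" pattern this posting names, here is a small, assumption-laden sketch (not Relanto's actual code) of a Pub/Sub-triggered Cloud Function that decodes an event and appends it to a BigQuery table; every resource name is a hypothetical placeholder.

```python
# Hedged sketch of a Pub/Sub-triggered Cloud Function (2nd gen, Python runtime).
# Topic, table, and field names are hypothetical; the target table is assumed to exist.
import base64
import json

import functions_framework
from google.cloud import bigquery


@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Decode a Pub/Sub push message and append it to a BigQuery table."""
    payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    record = json.loads(payload)

    client = bigquery.Client()
    errors = client.insert_rows_json(
        "my-project.analytics.events",  # hypothetical table
        [{"event_id": record["id"], "raw": payload}],
    )
    if errors:
        # Surfacing insert errors lets Cloud Functions retry the event.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```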
Posted 1 week ago
25.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The Company Dexcom Corporation (NASDAQ DXCM) is a pioneer and global leader in continuous glucose monitoring (CGM). Dexcom began as a small company with a big dream: To forever change how diabetes is managed. To unlock information and insights that drive better health outcomes. Here we are 25 years later, having pioneered an industry. And we're just getting started. We are broadening our vision beyond diabetes to empower people to take control of health. That means personalized, actionable insights aimed at solving important health challenges. To continue what we've started: Improving human health. We are driven by thousands of ambitious, passionate people worldwide who are willing to fight like warriors to earn the trust of our customers by listening, serving with integrity, thinking big, and being dependable. We've already changed millions of lives and we're ready to change millions more. Our future ambition is to become a leading consumer health technology company while continuing to develop solutions for serious health conditions. We'll get there by constantly reinventing unique biosensing-technology experiences. Though we've come a long way from our small company days, our dreams are bigger than ever. The opportunity to improve health on a global scale stands before us. Meet The Team We’re a collaborative and innovative software quality team focused on ensuring the reliability and performance of Dexcom’s continuous glucose monitoring (CGM) systems. Our mission is to build quality into every stage of the development lifecycle through smart automation, rigorous testing, and a passion for improving lives. If you're eager to grow your skills while contributing to life-changing technology, this is the team for you. Where You Come In You participate in building quality into products by writing automated tests and performing ad hoc testing throughout the development cycle. You contribute to the development of software requirements and design specifications. You design, develop, execute, and maintain both automated and manual test scripts to validate Dexcom CGM software and systems. You create verification and validation test plans, traceability matrices, and test reports, and review them with relevant stakeholders. You record and track issues using the bug tracking system. You analyze test failures and collaborate with development teams to investigate root causes. You contribute to continuous improvement of the release process. What makes you successful: You have 1–3 years of hands-on experience in software development or software test development using Python or other object-oriented programming languages. You have experience with SQL and NoSQL databases. You bring experience in automated test development for API testing. You have worked with a variety of automated testing frameworks, including Robot Framework. Your strong understanding of API testing, microservices, and distributed systems in cloud environments sets you apart. You have experience with automated UI testing. You are familiar with cloud platforms like Google Cloud or AWS. You have experience with containerization tools such as Docker and Kubernetes. You may have experience in the medical device industry and familiarity with FDA design control processes (highly desired). Your knowledge of GCP tools like Airflow, Dataflow, and BigQuery is a plus. Experience with distributed event streaming platforms like Kafka is a plus. Experience with performance testing is a plus. 
You bring CI/CD experience, especially with cloud-based technologies. You have five or more years of Agile development and test development experience. You collaborate effectively across functions to support testing, deployment, and reporting on product performance, quality, security, and stability. You are a self-starter who works well with minimal guidance and communicates clearly both verbally and in writing. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Dexcom. Only authorized staffing and recruiting agencies may use this site or to submit profiles, applications or resumes on specific requisitions. Dexcom does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to the Talent Acquisition team, Dexcom employees or any other company location. Dexcom is not responsible for any fees related to unsolicited resumes/applications.
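To illustrate the automated API testing the role above emphasises, here is a small, hypothetical check written with pytest and requests; the endpoint, field names, and value ranges are invented for the example and are not Dexcom APIs.

```python
# Hedged sketch of an API contract test: endpoint and response shape are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test


def test_latest_reading_returns_valid_record():
    response = requests.get(f"{BASE_URL}/v1/readings/latest", timeout=10)

    assert response.status_code == 200
    body = response.json()
    # Basic contract checks: required fields exist and values are plausible.
    assert "timestamp" in body
    assert 20 <= body["glucose_mg_dl"] <= 600
```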
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
GCP - Google Cloud Engineer - Chennai GCP Engineer Job Summary: We are seeking a highly skilled and experienced GCP Engineer with a strong background in deployment and automation. The ideal candidate will be a subject matter expert in Google Cloud Platform, responsible for providing advanced technical support, troubleshooting complex issues, and ensuring the reliability, scalability, and performance of our cloud infrastructure. This role requires a blend of deep technical expertise in GCP services and hands-on experience in automating infrastructure provisioning and application deployments. Key Responsibilities: Advanced Technical Support: Serve as the final escalation point for all GCP-related technical issues. Diagnose and resolve complex problems related to infrastructure, networking, security, and application performance. Incident and Problem Management: Lead the resolution of major incidents, perform root cause analysis (RCA), and implement preventative measures to minimize future occurrences. Infrastructure as Code (IaC) and Automation: Design, build, and maintain our cloud infrastructure using IaC principles with tools like Terraform. Develop and enhance automation scripts using Python or Bash to streamline operational tasks. Monitoring, Logging, and Alerting: Implement and manage comprehensive monitoring and logging solutions using Google Cloud's operations suite (formerly Stackdriver) and other third-party tools to ensure proactive issue detection and resolution. Security and Compliance: Implement and enforce security best practices within the GCP environment, including IAM policies, network security controls, and data encryption. Ensure compliance with industry standards. Mentorship and Collaboration: Provide technical guidance and mentorship to L1/L2 support engineers. Collaborate closely with development, operations, and security teams to foster a culture of reliability and automation. Required Qualifications: Experience: 5+ years of experience in cloud engineering, with at least 3 years of hands-on experience with Google Cloud Platform. GCP Expertise: In-depth knowledge of core GCP services including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, VPC, Cloud SQL, and BigQuery. Infrastructure as Code (IaC): Proven experience with Terraform for provisioning and managing cloud infrastructure. Scripting: Proficiency in scripting languages such as Python or Bash for automation. Containerization: Strong understanding of Docker and container orchestration with Kubernetes. Troubleshooting: Demonstrated ability to troubleshoot complex technical issues across the entire technology stack. Preferred Qualifications: Certifications: Google Cloud Certified - Professional Cloud Architect, Professional Cloud DevOps Engineer, or Professional Cloud Security Engineer. Networking: Advanced knowledge of cloud networking concepts, including VPNs, interconnects, and firewall rules. Databases: Experience with managing and optimizing relational and NoSQL databases in a cloud environment. Big Data: Familiarity with GCP's data and analytics services like Dataflow, Dataproc, and Pub/Sub. Communication: Excellent verbal and written communication skills, with the ability to articulate technical concepts to both technical and non-technical audiences.
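As one hedged example of the Python automation a role like this involves (not this employer's tooling), the snippet below scans a GCP project for Compute Engine instances that are not running, a common starting point for cost and hygiene reviews; the project ID is a placeholder.

```python
# Minimal sketch, assuming Application Default Credentials and a hypothetical project ID:
# list Compute Engine instances across all zones that are not in the RUNNING state.
from google.cloud import compute_v1


def list_stopped_instances(project_id: str) -> list[str]:
    """Return zone-qualified names of instances that are not currently running."""
    client = compute_v1.InstancesClient()
    stopped = []
    # aggregated_list yields (zone, InstancesScopedList) pairs covering every zone.
    for zone, scoped_list in client.aggregated_list(project=project_id):
        for instance in scoped_list.instances:
            if instance.status != "RUNNING":
                stopped.append(f"{zone}/{instance.name}")
    return stopped


if __name__ == "__main__":
    for name in list_stopped_instances("my-project"):  # hypothetical project
        print(name)
```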
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Technology Lead Location: Pune, Maharashtra, India Experience Level: 8+ Years Role Overview: We are seeking an accomplished and highly driven Tech Lead to spearhead the technical direction, architecture, and hands-on development of our critical data and application infrastructure on Google Cloud Platform. This role is for a visionary leader who thrives on solving complex technical challenges, driving innovation with AI/LLM capabilities, and ensuring the delivery of scalable, highly available, and secure solutions across the complete software development lifecycle. If you possess a profound technical depth, exceptional leadership skills, and a "Get Things Done" attitude, you will be instrumental in shaping our technological future. Key Responsibilities: Technical Leadership & Architectural Ownership: Drive the technical vision, architecture, and design for complex, large-scale software solutions on GCP, ensuring high quality, scalability, maintainability, and adherence to best practices. Lead and mentor a team of engineers, fostering a culture of technical excellence, continuous learning, and innovation, providing deep technical and architectural guidance. Own the technical roadmap and strategic initiatives, influencing key stakeholders and driving architectural decisions across the tech organization. Conduct rigorous code and design reviews, ensuring solutions meet stringent technical standards and business requirements. GCP, Data Engineering & Cloud Operations: Architect, build, and optimize robust, high-performance data pipelines and ETL/ELT processes using Apache Airflow for orchestration. Lead the development and management of comprehensive data solutions utilizing Google Cloud Platform (GCP) services, including BigQuery for data warehousing, PostgreSQL for relational databases, Dataflow for stream/batch processing, and Pub/Sub for real-time messaging. Ensure paramount data integrity, cloud security, and optimal performance across all data systems. Oversee infrastructure management, resource monitoring, and cost optimization strategies within the cloud environment. Design and implement systems for exceptional scalability and high availability, managing complex networking with cloud and infrastructure. AI/LLM Integration & Strategic Automation: Lead the strategy, design, and implementation of Large Language Models (LLMs) and advanced AI coding tools into core engineering workflows to significantly enhance developer productivity and automate complex processes. Drive initiatives for end-to-end workflow automation across the software development lifecycle, leveraging AI, scripting, and advanced tooling. Lead POCs with new technologies and AI POCs, conducting benchmarking of AI tools to identify and champion the right solutions for the team and organization. Problem Solving & Innovation: Proactively identify, diagnose, and resolve highly complex technical issues, providing innovative and sustainable solutions. Champion the adoption of new technologies and methodologies to continuously improve system performance, reliability, and the overall developer experience. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate extensively with product owners, data scientists, and other engineering teams to translate business needs into robust technical solutions and build sales-centric features. Engage directly with customers to gather insights, provide technical guidance, and ensure solutions meet their evolving needs. 
Clearly articulate complex technical concepts and strategic decisions to diverse technical and non-technical stakeholders. Manage the complete software development lifecycle from conception to deployment and ongoing operations. Required Skills & Qualifications: 8+ years of progressive experience in software engineering, with a strong emphasis on backend development, cloud-native architectures, and technical leadership roles. Expert-level proficiency in Python, including extensive experience with Django and Flask frameworks. Deep expertise in Google Cloud Platform (GCP) services, including BigQuery, PostgreSQL, Airflow, Dataflow, and Pub/Sub. Proven track record in architecting, designing, and maintaining scalable, high-performance data pipelines and distributed systems. Extensive hands-on experience with LLM integration (e.g., OpenAI GPT-4, Vertex AI Gemini) and leveraging AI coding tools (e.g., GitHub Copilot, Cursor, etc.) for enhanced productivity. Demonstrated ability to implement and optimize complex workflow automation. Solid understanding of relational databases and SQL, with advanced proficiency in PostgreSQL. Strong experience with version control systems (Git) and designing/implementing robust CI/CD pipelines (GitHub Actions, etc.). Deep understanding of microservices architecture, serverless applications, and API design principles. Expertise in containerization technologies (Docker, Kubernetes). Understanding of FHIR (Fast Healthcare Interoperability Resources) standards for healthcare data exchange. Proven experience with Cloud Security best practices and implementation. Demonstrable experience in infrastructure management, resource monitoring, and cloud cost optimization. A track record of designing and building systems for exceptional scalability and high availability. Strong understanding of core software engineering principles, design patterns, and architectural best practices. Exceptional problem-solving, analytical, and critical thinking skills. Outstanding communication, interpersonal, and leadership skills, with a proven ability to influence and guide technical teams. A "Get Things Done" attitude, high energy, and a passion for continuous learning and innovation. Bonus Points (Nice to Have): Experience with Infrastructure as Code (e.g., Terraform). Experience with other major cloud platforms (e.g., AWS Lambda, S3, RDS). Experience with streaming technologies like Kafka. Familiarity with visualization and analytics tools (Metabase, Apache Superset). Contributions to open-source projects or technical communities.
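To ground the Airflow-orchestrated BigQuery work this posting describes, here is a minimal DAG sketch under stated assumptions (Airflow 2.4+ with the Google provider installed); the project, dataset, table, and SQL are hypothetical and stand in for the real pipelines the role would own.

```python
# Hedged sketch of an Airflow DAG that runs a daily BigQuery rollup query.
# Requires apache-airflow>=2.4 and apache-airflow-providers-google; names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",  # run every day at 03:00 UTC
    catchup=False,
) as dag:
    build_daily_rollup = BigQueryInsertJobOperator(
        task_id="build_daily_rollup",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.daily_sales` AS
                    SELECT order_date, SUM(amount) AS total
                    FROM `my-project.sales.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```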
Posted 1 week ago