
1133 Dataflow Jobs - Page 16

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Impetus is hiring experienced GCP Data Engineers. If you are strong in Big Data, Spark, PySpark, and GCP services such as Pub/Sub, Dataproc, and BigQuery, and you are an immediate joiner who can start within 0-30 days, please share your resume at rashmeet.g.tuteja@impetus.com.

Responsibilities:
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Strong experience in Big Data, Spark, PySpark, and Python.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services such as Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have: knowledge of GCP services such as App Engine, GKE, Cloud Run, and Cloud Build.
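For illustration only (not part of the posting above), a minimal sketch of the kind of PySpark batch job such a role involves, assuming it runs on Dataproc where the spark-bigquery connector is pre-installed; all bucket, dataset, and table names are hypothetical placeholders.

```python
# Illustrative PySpark batch job: read raw CSV files from Cloud Storage,
# apply a simple aggregation, and load the result into BigQuery via the
# spark-bigquery connector (available on Dataproc). Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read structured data landed in a GCS bucket (hypothetical path)
orders = (
    spark.read.option("header", True)
    .csv("gs://example-raw-zone/orders/2024-01-01/*.csv")
)

# Basic cleansing and aggregation step
daily_totals = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Write the curated output to BigQuery (dataset/table are placeholders)
(
    daily_totals.write.format("bigquery")
    .option("table", "example_dataset.daily_order_totals")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```

A job like this would typically be submitted with `gcloud dataproc jobs submit pyspark`.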

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Impetus is hiring experienced GCP Data Engineers. If you are strong in Big Data, Spark, PySpark, and GCP services such as Pub/Sub, Dataproc, and BigQuery, and you are an immediate joiner who can start within 0-30 days, please share your resume at vaishali.tyagi@impetus.com.

Responsibilities:
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Strong experience in Big Data, Spark, PySpark, and Python.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services such as Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have: knowledge of GCP services such as App Engine, GKE, Cloud Run, and Cloud Build.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 8+ years
Location: Pune
Notice Period: 15 days or immediate joiner

Job Summary: We are seeking a highly skilled and experienced SQL and BigQuery expert with 8–9 years of hands-on experience in data engineering, data analytics, and cloud-based data warehousing solutions. The ideal candidate will be responsible for designing, optimizing, and maintaining large-scale datasets and advanced SQL queries using Google BigQuery, with a strong focus on performance, scalability, and reliability.

Key Responsibilities:
- Design and optimize complex SQL queries for data transformation and reporting.
- Develop and manage BigQuery data pipelines using native SQL, PL/SQL, and GCP services (e.g., Cloud Storage, Dataflow, Pub/Sub).
- Work closely with data engineers, analysts, and business stakeholders to understand data requirements.
- Automate data ingestion and transformation workflows using scheduled queries or orchestration tools (e.g., Cloud Composer / Airflow).
- Perform query performance tuning and troubleshoot latency issues in BigQuery.
- Implement best practices for cost optimization, data partitioning, clustering, and access control.
- Develop data marts and maintain semantic layers for BI tools (e.g., Looker, Tableau, Power BI).
- Ensure data quality, governance, and security standards are followed.

Required Skills and Qualifications:
- 8–9 years of experience in advanced SQL development and performance tuning.
- Minimum 2–3 years of strong hands-on experience with Google BigQuery in a production environment.
- Solid understanding of data warehousing concepts and best practices.
- Experience with cloud platforms (GCP preferred; AWS/Azure is a plus).
- Familiarity with scripting languages such as Python or Shell.
- Experience with version control (Git) and CI/CD workflows for data projects.
- Knowledge of LookML or other BI integration with BigQuery is a plus.
- Experience working in Agile/Scrum environments.
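As an editorial aside (not part of the posting), the partitioning and clustering practices mentioned above can be sketched with the google-cloud-bigquery Python client roughly as follows; project, dataset, and column names are hypothetical.

```python
# Minimal sketch: create a partitioned, clustered BigQuery table and run a
# partition-pruned query. All resource names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("revenue", "NUMERIC"),
]

table = bigquery.Table("example-project.analytics.fact_revenue", schema=schema)
# Partition by date and cluster by customer_id to limit bytes scanned
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Filtering on the partition column prunes partitions and reduces cost
query = """
    SELECT customer_id, SUM(revenue) AS total_revenue
    FROM `example-project.analytics.fact_revenue`
    WHERE event_date BETWEEN DATE '2024-01-01' AND DATE '2024-01-31'
    GROUP BY customer_id
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_revenue)
```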

Posted 1 month ago

Apply

5.0 years

0 Lacs

India

Remote

Position Title: MLOps Engineer
Experience: 5+ Years
Location: Remote
Employment Type: Full-Time

About the Role: We are looking for an experienced MLOps Engineer to lead the design, deployment, and maintenance of scalable and production-grade machine learning infrastructure. The ideal candidate will have a strong foundation in MLOps principles, expertise in GCP (Google Cloud Platform), and a proven track record in operationalizing ML models in cloud environments.

Key Responsibilities:
- Design, build, and maintain scalable ML infrastructure on GCP using tools such as Vertex AI, GKE, Dataflow, BigQuery, and Cloud Functions.
- Develop and automate ML pipelines for training, validation, deployment, and monitoring using Kubeflow Pipelines, TFX, or Vertex AI Pipelines.
- Collaborate closely with Data Scientists to transition models from experimentation to production.
- Implement robust monitoring systems for model drift, performance degradation, and data quality issues.
- Manage containerized ML workloads using Docker and Kubernetes (GKE).
- Set up and manage CI/CD workflows for ML systems using Cloud Build, Jenkins, Bitbucket, or similar tools.
- Ensure model security, versioning, governance, and compliance throughout the ML lifecycle.
- Create and maintain documentation, reusable templates, and artifacts for reproducibility and audit readiness.

Required Skills & Experience:
- Minimum 5 years of experience in MLOps, ML Engineering, or related roles.
- Strong programming skills in Python with experience in ML frameworks and libraries.
- Hands-on experience with GCP services including Vertex AI, BigQuery, GKE, and Dataflow.
- Solid understanding of machine learning concepts and algorithms such as XGBoost and classification models.
- Experience with container orchestration using Docker and Kubernetes.
- Proficiency in implementing CI/CD practices for ML workflows.
- Strong analytical, problem-solving, and communication skills.
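For context only, a minimal sketch of a Vertex AI / Kubeflow Pipelines (KFP v2) train-and-validate pipeline of the kind this role describes; the component logic, project, and bucket names are placeholders, not any actual pipeline.

```python
# Two toy components wired into a pipeline, compiled, and submitted to
# Vertex AI Pipelines. All names and the "training" logic are hypothetical.
from kfp import dsl, compiler
from google.cloud import aiplatform


@dsl.component(base_image="python:3.10")
def train_model(learning_rate: float) -> float:
    # Placeholder "training" that just returns a mock validation score
    return 0.9 - abs(learning_rate - 0.1)


@dsl.component(base_image="python:3.10")
def validate_model(score: float, threshold: float) -> bool:
    # Gate downstream deployment on the validation metric
    return score >= threshold


@dsl.pipeline(name="example-training-pipeline")
def training_pipeline(learning_rate: float = 0.1, threshold: float = 0.8):
    train_task = train_model(learning_rate=learning_rate)
    validate_model(score=train_task.output, threshold=threshold)


if __name__ == "__main__":
    compiler.Compiler().compile(training_pipeline, "training_pipeline.json")
    aiplatform.init(project="example-project", location="us-central1")
    job = aiplatform.PipelineJob(
        display_name="example-training-pipeline",
        template_path="training_pipeline.json",
        pipeline_root="gs://example-bucket/pipeline-root",
    )
    job.run()  # submits the compiled pipeline to Vertex AI Pipelines
```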

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

What Your Impact Will Be:
- Lead the development of scalable, secure, and high-performing data integration pipelines for structured and semi-structured data using Google BigQuery.
- Design and develop scalable data integration pipelines to ingest structured and semi-structured data from enterprise systems (e.g., ERP, CRM, E-commerce, Order Management) into a centralized cloud data warehouse using Google BigQuery.
- Build analytics-ready pipelines that transform raw data into trusted, curated datasets for reporting, dashboards, and advanced analytics.
- Implement transformation logic using DBT to create modular, maintainable, and reusable data models that evolve with business needs.
- Apply BigQuery best practices, including partitioning, clustering, and query optimization, to ensure high performance and scalability.
- Automate and monitor complex data workflows using Airflow/Cloud Composer, ensuring dependable pipeline orchestration and job execution.
- Develop efficient, reusable Python and SQL code for data ingestion, transformation, validation, and performance tuning across the pipeline lifecycle.
- Establish robust data quality checks and testing strategies to validate both technical accuracy and alignment with business logic.
- Partner with architects and technical leads to establish best practices, scalable frameworks, and reference implementations across projects.
- Collaborate with cross-functional teams, including data analysts, BI developers, and product owners, to understand integration needs and deliver impactful, business-aligned data solutions.
- Leverage modern ETL platforms such as Ascend.io, Databricks, Dataflow, or Fivetran to accelerate development and improve observability and orchestration.
- Contribute to technical documentation, CI/CD workflows, and monitoring processes to drive transparency, reliability, and continuous improvement across the data engineering ecosystem.
- Mentor junior engineers, conduct peer code reviews, and lead technical discussions.

What We're Looking For:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- Minimum 4+ years of hands-on experience in data engineering with strong expertise in data warehousing, pipeline development, and analytics on cloud.
- Hands-on experience in: Google BigQuery for large-scale data warehousing and analytics; Python for data processing, orchestration, and scripting; SQL for data wrangling, transformation, and query optimization; DBT for developing modular and maintainable data transformation layers; Airflow / Cloud Composer for workflow orchestration and scheduling.
- Proven experience building enterprise-grade ETL/ELT pipelines and scalable data architectures.
- Strong understanding of data quality frameworks, validation techniques, and governance processes.
- Proficiency in Agile methodologies (Scrum/Kanban) and managing IT backlogs in a collaborative, iterative environment.
- Preferred experience with: tools like Ascend.io, Databricks, Fivetran, or Dataflow; data cataloging/governance tools (e.g., Collibra); CI/CD tools, Git workflows, and infrastructure automation; real-time/event-driven data processing using Pub/Sub, Kafka, or similar platforms.
- Strategic problem-solving skills and ability to architect innovative solutions.
- Ability to adapt quickly to new technologies and lead adoption across teams.
- Excellent communication skills and ability to influence cross-functional teams.
- Good experience with Agile methodologies such as Scrum and Kanban, and with managing the IT backlog.
- Be a go-to expert for data technologies and solutions. (ref:hirist.tech)
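As an illustrative aside (not from the posting), the Airflow/Cloud Composer plus DBT orchestration pattern described above might look roughly like the DAG below; it assumes dbt is installed in the Composer environment, and all project, dataset, and path names are hypothetical.

```python
# Minimal Composer/Airflow DAG: load a staging table in BigQuery, then run
# dbt models that build the curated layer. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="orders_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",  # daily at 03:00
    catchup=False,
) as dag:

    load_staging = BigQueryInsertJobOperator(
        task_id="load_staging",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.staging.orders` AS
                    SELECT * FROM `example-project.raw.orders_landing`
                """,
                "useLegacySql": False,
            }
        },
    )

    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt --select orders",
    )

    load_staging >> run_dbt_models
```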

Posted 1 month ago

Apply

4.0 years

0 Lacs

Greater Hyderabad Area

Remote

Job Description: Data Engineer (Bangalore, India - Hybrid/Remote Considered)
Experience: 4+ Years
Notice Period: Immediate joiner to a maximum of 15 days

Role Overview: We are looking for two Data Engineers with strong experience in SAP, GCP, and finance/revenue domains to join our team in Bangalore, India. These roles require hands-on technical expertise in building scalable data solutions, integrating SAP data with cloud platforms, and working on finance-related data projects.

Key Responsibilities:
- Develop and maintain data pipelines for processing finance and revenue-related data from SAP to GCP.
- Work with Google Cloud services (BigQuery, Dataflow, Pub/Sub, Cloud Composer, Dataproc) to manage and optimize data processing.
- Extract and transform SAP data, ensuring seamless integration into cloud-based analytics platforms.
- Write optimized SQL, Python, and Spark scripts to support ETL/ELT workflows.
- Ensure data security, governance, and compliance best practices for financial datasets.
- Collaborate with cross-functional teams to support data analytics and reporting needs.

Required Qualifications:
- 4-6 years of hands-on experience in data engineering, ETL, and cloud data platforms.
- Strong expertise in SQL, Python, and Spark.
- Experience working with SAP Finance or Revenue modules and integrating SAP data with cloud platforms.
- Proficiency in Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Pub/Sub.
- Good understanding of data modeling, ETL/ELT processes, and large-scale data processing.
- Strong problem-solving skills and ability to work in a fast-paced environment.

Preferred Qualifications:
- Experience with SAP BODS, SAP HANA, or SAP Data Services for data extraction.
- Knowledge of data governance, compliance, and cloud cost optimization.
- Exposure to Terraform, Git, and DevOps practices.
- Familiarity with AI/ML applications in financial data analytics.

Why Join Us?
- Be part of a fast-growing AI-first data solutions company.
- Work on global projects with top-tier financial and technology enterprises.
- Competitive salary and career growth opportunities in AI and cloud data solutions.
- Hybrid work environment (Bangalore-based, with remote flexibility for exceptional candidates).

Important Note: If performance becomes an issue, the candidate must be ready to relocate to Bengaluru, KA (work from office / hybrid mode; this is not a fully remote opportunity but a hybrid mix of WFH and office). Notice period: immediate to early joiners, 15-20 days maximum. (ref:hirist.tech)

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Description: We are seeking a highly skilled and experienced GCP Cloud Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering with a focus on Google Cloud Platform (GCP) services. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP, ensuring data is accessible, reliable, and available for business use.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines using GCP services such as Dataflow, Dataproc, BigQuery, and Cloud Storage.
- Data Integration: Integrate data from various sources (structured, semi-structured, and unstructured) into GCP environments.
- Data Modeling: Develop and maintain efficient data models in BigQuery to support analytics and reporting needs.
- Data Warehousing: Implement data warehousing solutions on GCP, optimizing performance and scalability.
- ETL/ELT Processes: Build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python.
- Data Quality & Governance: Implement data quality checks, data lineage, and data governance best practices to ensure high data integrity.
- Automation: Automate data pipelines and workflows to reduce manual effort and improve efficiency.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs.
- Optimization: Continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency.
- Security: Ensure data security and compliance with industry standards and best practices.

Required Skills & Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 5+ years of experience in data engineering, with at least 2+ years working with GCP.
- Technical Skills: Proficiency in GCP services (BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Cloud Functions); strong programming skills in Python, SQL, and PySpark, with familiarity with Java/Scala; experience with orchestration tools like Apache Airflow; knowledge of ETL/ELT processes and tools; experience with data modeling and designing data warehouses in BigQuery; familiarity with CI/CD pipelines and version control systems like Git; understanding of data governance, security, and compliance.
- Soft Skills: Excellent problem-solving and analytical skills; strong communication and collaboration abilities; ability to work in a fast-paced environment and manage multiple priorities.

Preferred Qualifications (good to have):
- Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification.
- Domain Knowledge: Experience in the finance, e-commerce, or healthcare domain is a plus. (ref:hirist.tech)
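For illustration only, a minimal Apache Beam sketch of the Dataflow-style pipeline work described above: it reads CSV lines from Cloud Storage, parses them, and appends rows to BigQuery. Bucket, project, and table names are placeholders, and the CSV layout is assumed.

```python
# Minimal Beam pipeline; runs locally with DirectRunner, or on Dataflow by
# passing --runner=DataflowRunner plus project/region/staging options.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    # Expecting rows like: 2024-01-01,IN,149.90 (hypothetical format)
    event_date, country, amount = next(csv.reader([line]))
    return {"event_date": event_date, "country": country, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions()  # add Dataflow options on the command line
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-raw-zone/sales/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.sales_events",
                schema="event_date:DATE,country:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```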

Posted 1 month ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You'll Do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.
- Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
- Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
- Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment.
- Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions.
- Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
- Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What Experience You Need:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
- Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems; 2+ years of relevant experience with Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA, and RDBMS; minimum 2 years with Git, CI/CD pipelines, and Jenkins; experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable, and other GCP tools; familiarity with ETL processes, data modeling, and SQL.
- Soft Skills: Strong problem-solving abilities and a proactive approach to project management; effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What Could Set You Apart:
- Certification in Google Cloud (e.g., Associate Cloud Engineer).
- Experience with other cloud platforms (AWS, Azure) is a plus.
- Understanding of data privacy regulations and compliance frameworks.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do:
- Perform general application development activities, including unit testing, code deployment to the development environment, and technical documentation.
- Work on one or more projects, making contributions to unfamiliar code written by team members.
- Diagnose and resolve performance issues.
- Participate in the estimation process, use case specifications, reviews of test plans and test cases, requirements, and project planning.
- Document code/processes so that any other developer is able to dive in with minimal effort.
- Develop and operate high-scale applications from the backend to the UI layer, focusing on operational excellence, security, and scalability.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit engineering team employing agile software development practices.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Write, debug, and troubleshoot code in mainstream open source technologies.
- Lead the effort for Sprint deliverables and solve problems of medium complexity.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need:
- Bachelor's degree or equivalent experience.
- 5+ years of working experience in software development using the most recent version of Python.
- 3+ years of experience with software build management tools like Maven or Gradle.
- 3+ years of experience with CI/CD Jenkins pipeline development and backend coding.
- 3+ years of experience with software testing, performance, and quality engineering techniques and strategies.
- 3+ years of experience with cloud technology: GCP, AWS, or Azure is preferable.
- Experience and familiarity with the various Python frameworks currently used in software development.

What Could Set You Apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle.
- Agile environments (e.g. Scrum, XP).
- Relational databases (e.g. SQL Server, MySQL).
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub).
- Experience developing with modern Python versions.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

8.0 - 12.0 years

12 - 18 Lacs

Noida, Pune, Bengaluru

Work from Office

- Lead the technical discovery process, assess customer requirements, and design scalable solutions leveraging a comprehensive suite of Data & AI services, including BigQuery, Dataflow, Vertex AI, Generative AI solutions, and advanced AI/ML services like Vertex AI, Gemini, and Agent Builder.
- Architect and demonstrate solutions leveraging generative AI, large language models (LLMs), AI agents, and agentic AI patterns to automate workflows, enhance decision-making, and create intelligent applications.
- Develop and deliver compelling product demonstrations, proofs-of-concept (POCs), and technical workshops that showcase the value and capabilities of Google Cloud.
- Strong understanding of data warehousing, data lakes, streaming analytics, and machine learning pipelines.
- Collaborate with sales to build strong client relationships, articulate the business value of Google Cloud solutions, and drive adoption.
- Lead and contribute technical content and architectural designs for RFI/RFP responses and technical proposals leveraging Google Cloud services.
- Stay informed of industry trends, competitive offerings, and new Google Cloud product releases, particularly in the infrastructure and data/AI domains.
- Extensive experience in architecting and designing solutions on Google Cloud Platform, with a strong focus on Data & AI services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI (MLOps, custom models, pre-trained APIs), and Generative AI (e.g., Gemini).
- Strong understanding of cloud architecture patterns, DevOps practices, and modern software development methodologies.
- Ability to work effectively in a cross-functional team environment with sales, product, and engineering teams.
- 5+ years of experience in pre-sales or solutions architecture, focused on cloud Data & AI platforms.
- Skilled in client engagements, technical presentations, and proposal development.
- Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.

Location: Noida, Pune, Bengaluru, Hyderabad, Chennai

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values everyday results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves. Team Five9 is a leading provider of cloud software for the enterprise contact center market, powering more than three billion customer interactions annually. Since 2001, Five9 has pioneered the cloud revolution in contact centers, helping businesses transition from legacy systems to the cloud. Our cloud-based solutions are reliable, secure, scalable, and designed to enhance customer experiences, improve agent productivity, and deliver measurable business results. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States. About The Role We are seeking an experienced Site Leader to drive strategy, execution, and leadership for Five9 Voice Services . This role will oversee high-performing engineering teams, ensuring the development of scalable, reliable microservices while fostering a culture of innovation and operational excellence. Key Responsibilities Lead and grow high-performing teams, promoting a culture of collaboration, accountability, and innovation. Oversee the execution of product engineering, ensuring successful delivery of the Five9 Voice Services roadmap. Design, implement, and maintain REST APIs to support workflow automation and seamless integration with internal tools and external systems. Develop and automate dashboards that provide key performance insights to both internal teams and customers, ensuring actionable metrics for performance tracking. Integrate robust QA practices into the development lifecycle, including automated testing frameworks and continuous integration for high code quality and reliability. Define, develop, and maintain end-to-end journey tests to ensure consistent and seamless user experiences across voice workflows and services. Analyze large-scale service data to uncover trends, detect data deviations for alerts, and generate insights to improve service reliability. Define and drive technical requirements, ensuring alignment with business goals and customer needs. Lead technical and business discussions, ensuring the right decisions are made to solve complex challenges. Required Skills 8+ years of demonstrated experience in site leadership, driving local engineering culture, team development, and cross-functional collaboration. 5+ years of experience developing event streaming applications that correlate, decorate, and calculate streaming events in real time. 5+ years of experience with modern enterprise frameworks such as SpringBoot microservices, DevOps environments, Docker, and Kubernetes. 5+ years of experience designing, implementing, and consuming REST APIs, with a focus on enabling automation and integration. Hands-on experience integrating QA into the development lifecycle, including automated testing, CI/CD pipelines, and regression testing. 
Experience designing and maintaining journey tests that validate end-to-end user workflows across complex systems and services. Strong knowledge of cloud technologies such as Google Cloud Platform (GCP), Pub/Sub, Dataflow, BigQuery, and Cloud Functions. Diversity & Inclusion Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. Location Our headquarters are located in the beautiful Bishop Ranch Business Park in San Ramon, CA. View our privacy policy, including our privacy notice to California residents here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Engineer with AI/ML expertise to join our technology team. The ideal candidate will have a strong background in cloud engineering, machine learning deployment, and automation, with at least 4 years of experience working on end-to-end cloud-based solutions. This role involves building scalable infrastructure, deploying machine learning models into production, and working closely with data scientists, DevOps, and software engineers to drive intelligent, data-driven solutions.

Key Responsibilities:

Cloud Infrastructure (GCP)
- Design, implement, and manage scalable, secure, and high-performance infrastructure on Google Cloud Platform.
- Build and optimize CI/CD pipelines for ML model training and deployment.
- Develop and maintain GCP services such as GKE, BigQuery, Cloud Functions, Vertex AI, Dataflow, Pub/Sub, Cloud Storage, IAM, and Cloud Composer.
- Automate infrastructure provisioning using Terraform, Deployment Manager, or similar IaC tools.
- Monitor system performance, optimize costs, and ensure high availability and disaster recovery.

AI/ML Engineering
- Collaborate with Data Science teams to operationalize ML models and deploy them into production using Vertex AI or Kubeflow Pipelines.
- Implement and manage ML workflows, including data ingestion, training, tuning, and serving.
- Support MLOps practices including versioning, testing, and monitoring of ML models.
- Ensure compliance with model governance, security, and ethical AI practices.

Collaboration & Support
- Provide technical mentorship to junior engineers and MLOps professionals.
- Work with cross-functional teams including product, engineering, and data to ensure timely delivery of projects.
- Create documentation, knowledge bases, and SOPs for deployment and operations processes.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- 4+ years of professional experience in cloud engineering or AI/ML infrastructure, with at least 2 years on GCP.
- Hands-on experience with GCP tools such as Vertex AI, BigQuery, GKE, Cloud Functions, and Dataflow.
- Proficient in Python, SQL, and scripting for automation and data processing.
- Strong knowledge of MLOps principles, model versioning, monitoring, and rollback strategies.
- Experience with containerization (Docker) and orchestration (Kubernetes/GKE).
- Familiarity with CI/CD tools (e.g., Jenkins, Cloud Build, GitLab CI/CD).
- Experience with infrastructure-as-code (IaC) tools like Terraform or Ansible.

Preferred Qualifications
- GCP certifications such as Professional Cloud Architect, Professional Data Engineer, or Machine Learning Engineer.
- Experience with real-time data processing and streaming analytics (e.g., using Pub/Sub + Dataflow).
- Familiarity with data governance, security best practices, and GDPR/PII compliance in ML applications.
- Exposure to multi-cloud or hybrid cloud environments.
- Knowledge of ML frameworks like TensorFlow, PyTorch, Scikit-learn, or XGBoost.

Key Competencies
- Strong problem-solving skills and analytical thinking.
- Excellent communication and collaboration abilities.
- Proactive mindset with a focus on innovation and continuous improvement.
- Adaptable in a fast-paced, evolving technological environment.
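As a hedged sketch (not this team's actual workflow), operationalizing a trained model on Vertex AI, one of the responsibilities listed above, might look roughly like this; the project, artifact location, and serving container image are assumptions, and the model artifact is assumed to already exist in GCS in a compatible format.

```python
# Register a model artifact, deploy it to an online endpoint, and call it.
# All resource names and the container image URI are illustrative only;
# use the prebuilt serving container matching your framework and version.
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="us-central1")

# Register the trained model artifact with the Vertex AI Model Registry
model = aiplatform.Model.upload(
    display_name="churn-xgboost",
    artifact_uri="gs://example-models/churn/v1/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-7:latest"
    ),
)

# Deploy to an autoscaling online endpoint
endpoint = model.deploy(
    machine_type="n1-standard-2",
    min_replica_count=1,
    max_replica_count=2,
)

# Online prediction call with a hypothetical feature vector
prediction = endpoint.predict(instances=[[0.3, 12, 1, 0.08]])
print(prediction.predictions)
```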

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Req ID: 327059 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python Pyspark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN). Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer. Proficient in Python for data processing, scripting, and automation in cloud and distributed environments. Solid working knowledge of Apache Spark / PySpark, with experience in large-scale data transformation and performance tuning. Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer. Ability to work independently in fast-paced Agile environments with strong problem-solving and communication skills. Exposure to modern data architectures and real-time/streaming data solutions is an added advantage. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .

Posted 1 month ago

Apply

2.0 years

3 - 6 Lacs

Hyderābād

On-site

About the job: Our Hubs are a crucial part of how we innovate, improving performance across Sanofi departments and providing a springboard for the amazing work we do. Build a career and you can be part of transforming our business while helping to change millions of lives. Ready?

As a Senior Biomarker Biostatistician within our Statistics Team at Hyderabad, you will play a crucial role in developing and implementing different statistical and machine learning algorithms to solve complex problems and support our clinical biomarker data insights generation. You will work closely with cross-functional teams to ensure data is accessible, reliable, and optimized for analysis. Your expertise in omics data, machine learning, and deep learning will be essential in driving our data initiatives forward. We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?

Main responsibilities:
- Provide high-quality input regarding TM/Biomarker aspects into the design of the clinical study (including protocol development) and the setup of the study, making sure biomarker data are adequately captured and collected to answer the study objectives and to support the planned statistical analyses, under the guidance of the project biomarker statistical lead.
- Coordinate the activities of external partners and CROs for biomarker data generation, dataflow, or biomarker statistical activities.
- Perform pre-processing and normalization of biomarker data (e.g., RNAseq, scRNAseq, Olink, flow cytometry data, etc.).
- Perform and/or coordinate with the programming team the production of the definitions, documentation, and review of derived variables, as well as the quality control plan.
- Perform and/or coordinate with the study programmer the production of biomarker statistical analyses. Review and examine statistical data distributions/properties.
- Oversee execution of the statistical analyses according to the SAP, prepare statistical methods, and provide statistical insight into the interpretation and discussion of results sections for the clinical study report (CSR) and/or publications, ensuring the statistical integrity of the content according to internal standards and regulatory guidelines and in compliance with SOPs.
- Propose, prepare, and perform exploratory biomarker data analyses and ad-hoc analyses as relevant for the study, project objectives, or publications.

About you
- Experience: 2+ years (Master's) or 1+ years (PhD) of pharmaceutical or related industry experience.
- Soft and technical skills: basic knowledge of pharmaceutical clinical development; good knowledge and understanding of key statistical concepts and techniques, in particular high-dimensional statistics; good knowledge in handling complex biomarker data (e.g., RNAseq, scRNAseq, Olink, flow cytometry data, etc.), with knowledge of pathway-level and network analyses a plus; able to work in a departmental computing environment and perform advanced statistical analyses using R and R Shiny, and possibly other languages (Python, C++, ...); demonstrated interpersonal and communication skills and able to work in cross-functional and global team settings.
- Education: Master's or Ph.D. in Data Science, Bioinformatics, Statistics, or a related field.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description: The Global Data Insight & Analytics organization is looking for a top-notch Software Engineer who also has Machine Learning knowledge and experience to join our team and drive the next generation of Cloud platform full-stack developers. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers and designers from distributed locations, and will release early and often. The team you will be working on is focused on building a Cloud platform to democratize Machine Learning. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, based on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organizational skills combined with critical thinking, problem-solving, and agile management tools to support team success.

Responsibilities
- 5+ years of experience in data engineering or software engineering, with at least 2 years focused on cloud data platforms (GCP preferred).
- Technical Skills: Proficient in Java, Spring Boot, and Angular/React, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
- Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context.
- Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., Angular, React, Node.js).
- Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
- Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.

Qualifications
- Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
- Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
- Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
- GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs.
- Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
- Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
- Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are hiring for a Power BI Developer.

- Should have basic knowledge of data warehouse concepts.
- Develop and enhance Power BI reports and dashboards.
- Experienced in data modelling in Power BI, including M Query and DAX.
- Experienced in Power BI features like RLS, incremental data load, dataflows, etc.
- Should have good exposure working with a diverse set of visuals and various data sources like SQL Server, BigQuery, etc.
- Proficient in T-SQL.

Job description:
- Design, analyze, develop, test, and debug Power BI reports and dashboards to satisfy business requirements.
- Ability to translate business requirements into technical solutions.
- Collaborate with other teams within the organization and be able to devise the technical solution as it relates to the business and technical requirements.
- Experienced in client communication; excellent communication skills; should be a team player.
- Maintain documentation for all processes implemented.
- Adhere to and suggest improvements to coding standards and best practices, and contribute to the improvement of these best practices.

Experience: 5-12 years
Location: Hyderabad / Bangalore
Primary Skills: Power BI, SQL, DAX
Working Days: Hybrid
Joining time: Immediate to 30 days

If the above criteria match your profile, please share your profile with Ritusmita.MatagajSingh@ltimindtree.com and swathi.gangu@ltimindtree.com with the details below: relevant experience in Power BI; relevant experience in SQL; current CTC; expected CTC; current location; preferred location; offer in hand, if any; PAN card number; notice period / how soon you can join.

Regards,
Swathi, LTIM

Posted 1 month ago

Apply

30.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Client: Our client is a market-leading company with over 30 years of experience in the industry. It is one of the world's leading professional services firms, with $19.7B and 333,640 associates worldwide, helping clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Their specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

Job Title: GCP Data Engineer
Location: Gurgaon
Experience: 7-9 Years
Job Type: Contract to Hire
Notice Period: Immediate Joiners
Mandatory Skills: GCP; Big Data; ETL - Big Data / Data Warehousing; BigQuery, Dataproc, Dataflow, Composer

Job description: Looking for a GCP Developer with the below mandatory skills and requirements.
Mandatory Skills: BigQuery, Cloud Storage, Cloud Pub/Sub, Dataflow, Dataproc, Composer
- 6+ years in cloud infrastructure and designing data pipelines, specifically in GCP.
- Proficiency in programming languages: Python, SQL.
- Proven experience in designing and implementing cloud-native applications and microservices on GCP.
- Hands-on experience with CI/CD tools like Jenkins and GitHub Actions.
- In-depth understanding of GCP networking, IAM policies, and security best practices.

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with the customer.

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.

Preferred Technical and Professional Experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.

Posted 1 month ago

Apply

5.0 - 9.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- Design, develop, and maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub.
- Provision infrastructure on GCP using IaC with Terraform.
- Implement and manage data warehouse solutions.
- Monitor and resolve issues in data workflows.

Required Candidate Profile:
- Expertise in GCP, Apache Beam, Dataflow, and BigQuery.
- Proficient in Python, SQL, and PySpark.
- Has worked with Cloud Composer for orchestration.
- Solid understanding of DWH, ETL pipelines, and real-time data streaming.
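For illustration only, the Pub/Sub side of a real-time pipeline like the one described above could be sketched as follows; the topic and subscription are assumed to already exist, and all names are placeholders.

```python
# Publish a JSON event to a Pub/Sub topic and consume it with a streaming
# pull subscriber. Project, topic, and subscription names are hypothetical.
import json
from concurrent import futures

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"  # placeholder


def publish_event(event: dict) -> None:
    """Publish one JSON event to a pre-existing (hypothetical) topic."""
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, "orders-events")
    future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
    future.result()  # block until Pub/Sub acknowledges the message


def consume_events(listen_seconds: int = 30) -> None:
    """Pull messages from a pre-existing (hypothetical) subscription."""
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, "orders-events-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        print("received:", json.loads(message.data))
        message.ack()

    streaming_future = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_future.result(timeout=listen_seconds)
    except futures.TimeoutError:
        streaming_future.cancel()  # stop pulling after the listening window


if __name__ == "__main__":
    publish_event({"order_id": "A-1001", "amount": 149.9})
    consume_events()
```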

Posted 1 month ago

Apply

6.0 years

7 - 9 Lacs

Hyderābād

On-site

Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Software Engineer . In this role, you will: Enhance & drive the overall product strategy, providing the vision and roadmap for the Data Analytics platform over Cloud journey and to help drive future requirements with reduced operational costs. Implementation of IT strategy to support core business objectives and gain business value. Become the ‘voice’ of the business within technology to ensure strategies are cohesive across all business streams. Identify interdependencies between various integrated teams and release plans. Accountable for identifying and resolving any alignment issues within the component teams delivered through the Global IT organization. Part of global team, consisting of 20+ resources across development and support. Creation & execution of plans to support training, adequate levels of resourcing to support the global demand Accountable for ensuring the products & services are delivered adhering to the approved architecture and solutions to meet the customer needs. Drive/Supporting technical design, change for new and existing data sources and manage support for delivering state of art intelligence infrastructure. Evolution of the DevOps model, ensuring continued improvement of the technology lifecycle and alignment with stakeholder plans Adhere to compliance with external regulatory requirements, internal control standards and group compliance policy. Maintains HSBC internal control standards, including timely implementation of internal and external audit points. Take accountability to work closely and build a trusted relationship with the business to ensure delivery of the benefits outlined by the respective strategy. Requirements To be successful in this role, you should meet the following requirements: Retail banking environment, with good understanding of customer lifecycle across core products 6+ years of Industry experience, solid exposure to managing/supporting product-based teams providing global services. Developing and maintaining ReactJS-based web applications: This includes creating new features, enhancing existing ones, and ensuring the overall functionality and user experience of the application. Writing clean, efficient, and reusable React components: This involves using best practices for component design, ensuring code readability, and creating components that can be used across multiple applications. Implementing state management: This involves using tools like Redux or Context API to manage the application's data and state efficiently. Ensuring cross-browser compatibility and mobile responsiveness: This means ensuring that the application looks and functions correctly across different browsers and devices. 
Optimizing application performance: This includes identifying and fixing performance bottlenecks, improving loading times, and ensuring a smooth user experience. Working closely with backend developers to integrate APIs: This involves collaborating with backend developers to define API endpoints, consume them in the frontend, and ensure seamless data flow. Following best practices in UI/UX design and front-end architecture: This involves understanding UI/UX principles, designing user-friendly interfaces, and structuring the codebase in a maintainable and scalable way. Staying updated with the latest ReactJS trends and features: This means continuously learning about new features and best practices in the ReactJS ecosystem. Performing unit testing and debugging for high-quality applications: This involves writing unit tests to ensure the quality of the code, debugging issues, and fixing bugs. Maintaining code quality, organization, and documentation: This involves writing clear and concise code, organizing the codebase, and documenting the code for future reference. Skills: In-depth knowledge of JavaScript and ReactJS: This includes understanding core JavaScript concepts, React components, JSX, and state management. Familiarity with other front-end technologies: This can include HTML, CSS, Bootstrap, and potentially other frameworks like Angular or VueJS. Experience with state management libraries: This can include Redux, Context API, or MobX. Understanding of front-end performance optimization techniques: This can include lazy loading, code splitting, and image optimization. Experience with version control systems (e.g., Git): This is essential for collaborating with other developers and managing the codebase. Good communication and collaboration skills: This is crucial for working with other developers, designers, and stakeholders. Problem-solving skills and ability to debug: This is essential for identifying and fixing issues in the codebase. Understanding of UI/UX design principles: This helps in creating user-friendly and intuitive interfaces. Ability to write clean, well-documented code: This makes the code easier to maintain and understand. Experience with front-end build tools (e.g., Webpack, Babel): These tools are used to automate tasks like bundling, transpiling, and minifying code. Strong proven experience in data migration projects over cloud technologies like GCP/AWS, with hands-on experience in Docker/Kubernetes. Strong proven skills in Dataflow, Airflow, BigQuery, and the Big Data ecosystem, including Hadoop and cloud technologies. Strong knowledge of Data Warehousing, ETL, Analytics, and Business Intelligence Reporting. Experience of working in DevOps and Agile environments, with strong knowledge and experience of support tools like Jenkins, GIT, Nexus, Splunk, AppDynamics, etc. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 1 month ago

Apply

1.0 years

0 Lacs

Cochin

On-site

We are seeking a proactive and results-driven Healthcare Recruiter to join our HR team in Kochi. The recruiter will be responsible for sourcing, screening, and hiring qualified healthcare professionals.

Source healthcare professionals through job portals, social media, and campus drives
Conduct telephonic and face-to-face interviews
Coordinate with licensing teams for DataFlow, Prometric, and embassy processing
Maintain candidate databases and follow up regularly
Coordinate offer letters, onboarding, and deployment
Collaborate with manpower agencies and vendors for bulk hiring needs

Requirements
Bachelor's degree in HR / Healthcare Management
Minimum 1 year of experience in recruitment (preferably healthcare)
Familiarity with GCC licensing systems (DataFlow, Prometric) is a plus
Strong communication and interpersonal skills
Ability to multitask and meet recruitment deadlines
Excellent communication skills in English are mandatory

Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Kochi, Kerala: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)
Education: Bachelor's (Preferred)
Experience: HR: 1 year (Required)
Work Location: In person

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Software Developer
Location: Chennai, India

About The Job: Developers at Vendasta work in teams, working with Product Managers and Designers on the creation of new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other. The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer’s problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.

Your Impact:
Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work on them to completion.
Follow best practices to write clean, maintainable, scalable, and tested software.
Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
Provide inputs for a technical roadmap for the Product Area.
Ensure that the NFRs and technical debt get their due focus.
Work collaboratively with Product Managers to design solutions (including the technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
Analyze and improve current system integrations and migration strategies.
Interact and collaborate with our high-quality technical team across India and Canada.

What You Bring To The Table:
8+ years of experience in a related field, with at least 3 years as a full-stack developer in an architect or senior development role.
Experience with, or a strong understanding of, highly scalable, data-intensive, distributed Internet applications.
Software development experience, including building distributed, microservice-style and cloud-based application architectures.
Proficiency in a modern software language, and willingness to quickly learn our technology stack.
Preference will be given to candidates with strong Go (programming language) experience who can demonstrate the ability to build and adapt web applications using Angular.
Experience in designing, building and implementing cloud-native architectures (GCP preferred).
Experience working with the Scrum framework.

Technologies We Use:
Cloud Native Computing using Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies.
GoLang, TypeScript, Python, JavaScript, HTML, Angular, gRPC, Kubernetes
Elasticsearch, MySQL, PostgreSQL

About Vendasta: So what do we do? We create an entire platform full of digital products & solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing … and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business. We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!

Perks:
Stock options (as per policy)
Benefits - health insurance, paid time off, public transport reimbursement, flex days
Training & Career Development - professional development plans, leadership workshops, mentorship programs, and more!
Free snacks, hot beverages, and catered lunches on Fridays
Culture - comprised of our core values: Drive, Innovation, Respect, and Agility
Provident Fund
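The stack above leans on Cloud Pub/Sub to decouple microservices. As a rough, language-agnostic illustration of that pattern (sketched in Python rather than Go, and not Vendasta's actual code), the snippet below publishes an event to a topic; the project and topic IDs are hypothetical placeholders, and it assumes the google-cloud-pubsub client library and application default credentials.

```python
from google.cloud import pubsub_v1

# Illustrative placeholders only.
PROJECT_ID = "example-project"
TOPIC_ID = "order-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)


def publish_event(payload: bytes, **attributes: str) -> str:
    """Publish one message and block until Pub/Sub acknowledges it."""
    future = publisher.publish(topic_path, data=payload, **attributes)
    return future.result()  # the server-assigned message ID


if __name__ == "__main__":
    message_id = publish_event(b'{"order_id": "123"}', source="checkout-service")
    print(f"published message {message_id}")
```

A downstream service would consume the same topic through its own subscription, which is what lets producers and consumers scale and fail independently.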

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing…

We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer you will be collaborating with business product owners, coaches, industry-renowned data scientists and system architects to develop strategic data solutions from sources that include batch, file and data streams.

As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources - both structured and unstructured - into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon.

Understanding the business requirements and converting them to technical design.
Working on data ingestion, preparation and transformation.
Developing data streaming applications.
Debugging production failures and identifying the solution.
Working on ETL/ELT development.
Understanding the DevOps process and contributing to DevOps pipelines.

What We’re Looking For...

You’re curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You’ll need to have…

Bachelor’s degree or four or more years of work experience.
Four or more years of relevant work experience.
Experience with Data Warehouse concepts and the Data Management life cycle.
Experience in Big Data technologies - GCP/Hadoop/Spark/Composer/Dataflow/BigQuery.
Experience in complex SQL.
Experience working on streaming ETL pipelines.
Expertise in Java.
Experience with Memorystore / Redis / Spanner.
Experience in troubleshooting data issues.
Experience with data pipeline and workflow management and governance tools.
Knowledge of Information Systems and their applications to data management processes.

Even better if you have one or more of the following…

Three or more years of relevant experience.
Any relevant certification as an ETL/ELT developer.
Certification as a GCP Data Engineer.
Accuracy and attention to detail.
Good problem-solving, analytical, and research capabilities.
Good verbal and written communication.
Experience presenting to and influencing stakeholders.
Experience in driving a small team of 2 or more members for technical delivery.

#AI&D

Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
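For the streaming ETL requirement above, here is a minimal PySpark Structured Streaming sketch of the general pattern - reading newly arriving JSON event files, aggregating over event-time windows with a watermark, and writing curated Parquet output. The bucket paths, schema and field names are hypothetical placeholders (not Verizon systems), and running against gs:// paths assumes a cluster with the GCS connector, such as Dataproc.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("network-events-stream").getOrCreate()

# Illustrative schema for incoming network events.
schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("cell_id", StringType()),
    StructField("status", StringType()),
])

# Treat newly arriving JSON files in a landing folder as an unbounded stream.
events = spark.readStream.schema(schema).json("gs://example-bucket/landing/events/")

# Count failures per cell in 5-minute windows, tolerating 10 minutes of late data.
failures = (
    events
    .withWatermark("event_time", "10 minutes")
    .filter(F.col("status") == "FAIL")
    .groupBy(F.window("event_time", "5 minutes"), "cell_id")
    .count()
)

query = (
    failures.writeStream
    .outputMode("append")  # emit finalized windows once the watermark passes them
    .format("parquet")
    .option("path", "gs://example-bucket/curated/failures/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/failures/")
    .start()
)
query.awaitTermination()
```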

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do

Design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Manage sole project priorities, deadlines, and deliverables.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need

Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years of experience with Cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

Self-starter who identifies and responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence) and GitHub
Developing with modern JDK (v1.7+)
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
Cloud Certification strongly preferred

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
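The "set you apart" list above calls out big data processing with Dataflow/Apache Beam, BigQuery and Cloud Storage. As a rough sketch of that kind of pipeline (not Equifax's actual code), the Beam Python example below reads JSON lines from GCS, transforms them, and appends rows to a BigQuery table; the project, bucket and table names are hypothetical placeholders, and running it requires the apache-beam[gcp] package.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_record(line: str) -> dict:
    """Turn one JSON line into a row dict matching the BigQuery schema below."""
    record = json.loads(line)
    return {"id": record["id"], "score": int(record["score"])}


def run() -> None:
    # Illustrative placeholders only.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.json")
            | "Parse" >> beam.Map(parse_record)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.scores",
                schema="id:STRING,score:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Swapping runner="DataflowRunner" for "DirectRunner" lets the same pipeline run locally for testing before it is submitted to Dataflow.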

Posted 1 month ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within PwC.

Responsibilities:

Job Title: Cloud Data Engineer (AWS/Azure/Databricks/GCP)
Experience: 2-4 years in Data Engineering

Job Description: We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP.
- Implement data ingestion and transformation processes to facilitate efficient data warehousing.
- Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
  - GCP: Dataflow, BigQuery, DataProc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
- Optimize Spark job performance to ensure high efficiency and reliability.
- Stay proactive in learning and implementing new technologies to improve data processing frameworks.
- Collaborate with cross-functional teams to deliver robust data solutions.
- Work on Spark Streaming for real-time data processing as necessary.

Qualifications:
- 2-4 years of experience in data engineering with a strong focus on cloud environments.
- Proficiency in PySpark or Spark is mandatory.
- Proven experience with data ingestion, transformation, and data warehousing.
- In-depth knowledge of and hands-on experience with cloud services (AWS/Azure/GCP).
- Demonstrated ability in performance optimization of Spark jobs.
- Strong problem-solving skills and the ability to work independently as well as in a team.
- Cloud Certification (AWS, Azure, or GCP) is a plus.
- Familiarity with Spark Streaming is a bonus.

Mandatory skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years of experience required: 2-4 years
Education qualification: BE/BTECH, ME/MTECH, MBA, MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Python (Programming Language)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Strategy {+ 22 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
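Since the role emphasizes performance optimization of Spark jobs, here is a small, generic PySpark sketch of two common levers - broadcasting a small dimension table to avoid a shuffle-heavy join, and partitioning the output to support downstream pruning. The paths, table and column names are hypothetical placeholders, not client data.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-enrichment").getOrCreate()

# Illustrative inputs: a large fact table and a small dimension table.
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")
countries = spark.read.parquet("s3a://example-bucket/dim/countries/")

# Broadcasting the small dimension lets Spark skip shuffling the large fact table.
enriched = orders.join(F.broadcast(countries), on="country_code", how="left")

# Repartition by the write key, then persist partitioned by date so downstream
# queries can prune partitions instead of scanning everything.
(
    enriched
    .repartition("order_date")
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-bucket/curated/orders_enriched/")
)
```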

Posted 1 month ago

Apply