5.0 - 7.0 years
2 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Your role and responsibilities: Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
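For illustration, a minimal sketch of the BigQuery analysis work this role centers on, using the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical placeholders, and credentials are assumed to come from application-default auth:

```python
from google.cloud import bigquery

# Project and table names below are hypothetical placeholders.
client = bigquery.Client(project="my-project")

query = """
    SELECT order_date, COUNT(*) AS orders
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# client.query() starts the job; .result() waits and returns iterable rows.
for row in client.query(query).result():
    print(f"{row.order_date}: {row.orders} orders")
```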
Posted 6 days ago
5.0 - 7.0 years
5 - 7 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary As a Data Engineer III at Walmart, you will design, build, and maintain scalable data systems and architectures to enable advanced business intelligence and analytics. Your role will be pivotal in ensuring data quality, integrity, and security to support Walmart's data-driven decision-making and operational efficiency. You will collaborate closely with cross-functional teams to implement robust data solutions that drive business growth. About the Team The Walmart Last Mile Data team operates within the Data and Customer Analytics organization, focusing on optimizing Walmart's last mile delivery operations. This dynamic group uses cutting-edge data engineering and analytics technologies to improve delivery routing, reduce costs, and enhance customer satisfaction. What You'll Do Design, develop, and deploy data pipelines and integration solutions using technologies such as Spark, Scala, Python, Airflow, and Google Cloud Platform (GCP). Build scalable, efficient data processing systems while optimizing workflows to ensure high data quality and availability. Monitor, troubleshoot, and tune data pipelines to maximize reliability and performance. Partner with executive leadership, product, data, and design teams to address data infrastructure needs and technical challenges. Provide data scientists and analysts with well-structured data sets and tools to facilitate analytics and reporting. Develop analytics tools that contribute to Walmart's position as an industry leader. Stay current with data engineering trends and technologies, incorporating best practices to enhance data infrastructure. What You'll Bring Minimum 5 years of proven experience as a Data Engineer. Strong programming skills in Scala and experience with Spark for large-scale data processing and analytics. Expertise with Google Cloud Platform services such as BigQuery, Google Cloud Storage (GCS), and Dataproc. Experience building near real-time data ingestion pipelines using Kafka and Spark Structured Streaming. Solid knowledge of data modeling, data warehousing concepts, and ETL processes. Proficiency in SQL and NoSQL databases. Experience with version control systems, preferably Git. Familiarity with distributed computing frameworks and working with large-scale data sets. Understanding of CI/CD pipelines and tools such as Jenkins or GitLab CI. Experience with workflow schedulers like Airflow. Strong analytical and problem-solving abilities. Familiarity with BI and visualization tools such as Tableau or Looker. Experience or interest in Generative AI is a plus but not required. About Walmart Global Tech At Walmart Global Tech, your work impacts millions worldwide by simplifying complex challenges through technology. Join a diverse team of innovators shaping the future of retail, where people and technology come together to drive lasting impact. We support career growth with continuous learning and flexible hybrid work models. Minimum Qualifications Bachelor's degree in Computer Science or related field with at least 2 years of software engineering experience. OR 4 years of relevant software engineering or data engineering experience without a degree. OR Master's degree in Computer Science.
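As a flavor of the streaming ingestion work described above (the posting emphasizes Scala, but the same pattern in PySpark keeps this sketch in one language with the rest of the page), here is a hedged Kafka-to-GCS Structured Streaming sketch; the broker, topic, schema, and bucket paths are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

# Requires the spark-sql-kafka package; all names below are placeholders.
spark = SparkSession.builder.appName("delivery-events").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "delivery-events")
    .load()
    # Kafka values arrive as bytes; parse the JSON payload into columns.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://my-bucket/delivery-events/")
    .option("checkpointLocation", "gs://my-bucket/checkpoints/delivery-events/")
    .start()
)
query.awaitTermination()
```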
Posted 2 weeks ago
6.0 - 8.0 years
18 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
Posted 2 weeks ago
5.0 - 10.0 years
15 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow). Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java.
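To make the Composer requirement concrete, a minimal sketch of an Airflow DAG of the kind run on Cloud Composer; the DAG id, schedule, SQL, and task logic are hypothetical placeholders, not part of the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# A toy two-step pipeline: stage data, then run a BigQuery job.
with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pull source files into GCS here'",
    )

    load = BigQueryInsertJobOperator(
        task_id="load",
        configuration={
            "query": {
                "query": "SELECT COUNT(*) FROM `my-project.sales.orders`",
                "useLegacySql": False,
            }
        },
    )

    extract >> load  # run extract, then load
```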
Posted 2 weeks ago
7.0 - 10.0 years
13 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
We are seeking a skilled Salesforce Data Cloud Developer to join our dynamic IT team. This role involves developing and maintaining Salesforce Data Cloud solutions to enhance our mobile services and customer experience. The ideal candidate will have a strong background in Salesforce development, data integration, and cloud technologies. • Clear understanding of legacy systems and databases and of integrating them with the Salesforce ecosystem. Streamline data ingestion processes between Salesforce Data Cloud and our databases to ensure seamless data flow and accuracy. • Utilize Salesforce Data Cloud to gain deeper insights into customer behavior and preferences, driving personalized mobile experiences. • Implement AI-driven solutions and automation within Salesforce Data Cloud to optimize mobile service delivery and customer engagement. • Enhance mobile data strategies to support innovative mobile solutions and improve overall user experience. • Knowledge of AWS and GCS to support integration with Salesforce; thorough knowledge of the connectors available in Data Cloud. • Hands-on work with DLOs, DMOs, SQL, segmentation, and activation. • Clear understanding of integrating Data Cloud with Marketing Cloud. • Experience with available Data Cloud connectors such as BigQuery and GCS. Development: Design, develop, and implement custom Salesforce Data Cloud applications and enhancements tailored to organizational requirements. • Integration: Perform data integration and migration tasks, ensuring data accuracy and integrity across Salesforce and other systems. • Collaboration: Work closely with cross-functional teams, including marketing and technical, to align Salesforce solutions with business objectives. • Documentation: Create and maintain comprehensive documentation on processes, policies, application configurations, and user guides. • Testing: Conduct thorough testing and debugging of Salesforce applications to ensure high performance and reliability. Bachelor's Degree in Computer Science, Information Technology, Business, Engineering, or a related field • Minimum of 4-5 years of experience in the Salesforce ecosystem, with at least 2-3 years of hands-on experience with Salesforce Data Cloud • Strong ability to manage and communicate with both technical and non-technical stakeholders • Solid understanding of software development, systems integration, and cloud technologies • Strong strategic thinking and planning skills • Ability to work in a fast-paced, dynamic environment and manage multiple priorities • Experience with Agile methodologies and version control systems like Git. MBA or advanced degree in a related field • Salesforce Data Cloud certification (e.g., Salesforce Data Cloud Consultant) • Knowledge of cloud environments like AWS, Azure, or Google Cloud Platform • Experience in API integration with legacy systems
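As a loosely hedged sketch of the ingestion work mentioned above: streaming one record into Salesforce Data Cloud through its Ingestion API. The endpoint shape, connector and object names, and token handling are assumptions based on the general pattern of that API, not values from this posting; the real ones come from your Data Cloud setup and the official documentation:

```python
import requests

# All names below are hypothetical; replace with values from your org.
TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"
URL = "https://example.c360a.salesforce.com/api/v1/ingest/sources/mobile_events/usage_event"

payload = {"data": [
    {"customer_id": "C-1001", "event": "app_open", "ts": "2024-01-01T00:00:00Z"},
]}

resp = requests.post(
    URL,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # surfaces ingestion errors early
print(resp.status_code)
```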
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 316017 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Zabbix Administrator to join our team in Bangalore, Karnataka (IN-KA), India (IN). Zabbix Administration and Support - Roles and responsibilities: In-depth knowledge of enterprise monitoring tool architecture, administration, and configuration. Technically manage the design and implementation of the Zabbix tool. Hands-on experience with end-to-end deployment. In-depth knowledge of systems management, monitoring tools, ITIL processes, integrations with different tools, and scripting. Good understanding of automation and enterprise-wide monitoring tooling solutions. Hands-on experience in integrating enterprise monitoring tools with ITSM platforms. Minimum 5 years of hands-on experience in administering and configuring enterprise monitoring tools at an L3 level. Knowledge of IT infrastructure programming/scripting (Shell, JSON, MySQL, Python, Perl). Good understanding of operating systems (Windows and Unix). Must have good knowledge of public cloud platforms (Azure, AWS, GCS). Install and configure software and hardware. Apply Zabbix patches and upgrades as they become available to keep the environment current. Lead troubleshooting of issues and outages. Provide technical support as requested, for internal and external customers, primarily for Zabbix. Undertake individual assignments or work on a project as part of a larger team, analyzing customer requirements, gathering and analyzing data, and recommending solutions. Ensure assignments are undertaken consistently and with quality. Produce and update assignment documentation as required. Experienced in customer interaction. Good communication skills (verbal/written). Experienced in dealing with internal and external stakeholders independently during transitions and project-driven activities. Willing to work in a 24x7 work environment. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at . NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click .
If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
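For a taste of the administration work above, a minimal sketch against the Zabbix JSON-RPC API: log in, then list hosts. The server URL and credentials are placeholders, and parameter names vary slightly across Zabbix versions (e.g., "username" in recent releases vs. the older "user" in user.login):

```python
import requests

API = "https://zabbix.example.com/api_jsonrpc.php"  # placeholder server

def rpc(method, params, auth=None, req_id=1):
    """Send one JSON-RPC call to Zabbix and return its result."""
    resp = requests.post(
        API,
        json={"jsonrpc": "2.0", "method": method, "params": params,
              "auth": auth, "id": req_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

# "username" applies to recent Zabbix versions; older ones use "user".
token = rpc("user.login", {"username": "api_user", "password": "secret"})
for host in rpc("host.get", {"output": ["hostid", "host"]}, auth=token):
    print(host["hostid"], host["host"])
```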
Posted 3 weeks ago
4 - 7 years
6 - 9 Lacs
Chennai
Work from Office
Skills: Google BigQuery, SQL, Python, Apache Airflow, Oracle-to-BigQuery DWH migration and modernization, Dataproc, GCS, PySpark, Oracle DB and PL/SQL. Required candidate profile: Notice period: 0-15 days. Education: BE, B.Tech, ME, M.Tech.
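A hedged sketch of one step in the Oracle-to-BigQuery migration named in the skills list: read a table over JDBC with PySpark and write it out through the spark-bigquery connector. Hostnames, credentials, and table names are placeholders, and both the Oracle JDBC driver and the connector jar are assumed to be on the classpath:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-bq").getOrCreate()

# Read one Oracle table over JDBC (all connection details are placeholders).
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1")
    .option("dbtable", "SALES.ORDERS")
    .option("user", "migration_user")
    .option("password", "secret")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

# Write to BigQuery via the spark-bigquery connector, staging through GCS.
(
    df.write.format("bigquery")
    .option("table", "my-project.sales.orders")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```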
Posted 2 months ago
2 - 4 years
5 - 8 Lacs
Pune
Work from Office
We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines. Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues; debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements or fixes to the product, and contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres. Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
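Since the posting leads with Python and FastAPI, a minimal sketch of a service endpoint in that style; the route, request model, and stubbed summarization logic are hypothetical placeholders rather than anything from the role:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class SummarizeRequest(BaseModel):
    text: str
    max_words: int = 50

@app.post("/summarize")
def summarize(req: SummarizeRequest) -> dict:
    # Stand-in for a call to a GenAI model; here we just truncate.
    words = req.text.split()[: req.max_words]
    return {"summary": " ".join(words)}

# Run locally with: uvicorn app:app --reload
```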
Posted 2 months ago
4 - 7 years
18 - 22 Lacs
Pune
Work from Office
UKG is a leader in the HCM space and is at the forefront of artificial intelligence innovation, dedicated to developing cutting-edge generative AI solutions that transform the HR/HCM industry and enhance user experiences. We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines. Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues; debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements or fixes to the product, and contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
Posted 2 months ago
5 - 10 years
13 - 23 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Title: GCP Data Engineer (Senior / Lead / Architect / Program Manager) Experience: 5 - 20 years Location: Chennai, Hyderabad, Bangalore Required Skills: GCP, BigQuery, Cloud Storage, Dataflow, Python, Cloud Functions, Pub/Sub Notice period: Immediate joiners Job Description: Experience leading, designing and developing Data Engineering solutions using Google Cloud Platform (GCP): BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, Cloud Composer (Airflow), Cloud Spanner, Bigtable, etc. Experience building CI/CD pipelines to automate deployment and testing of data pipelines. Experience in managing and deploying containerized applications. Proficient in Python for data processing and automation, and in SQL for querying and data manipulation. Experience with Cloud Monitoring, Datadog, or other monitoring solutions to track pipeline performance and ensure operational efficiency. Familiarity with Terraform or Deployment Manager for Infrastructure as Code (IaC) to manage GCP resources will be a plus.
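To ground the Pub/Sub requirement, a minimal publish-and-pull sketch with the google-cloud-pubsub client; the project, topic, and subscription names are placeholders:

```python
from concurrent import futures

from google.cloud import pubsub_v1

project = "my-project"  # placeholder

# Publish one message.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, "orders")
future = publisher.publish(topic_path, b'{"order_id": "A-1"}')
print("published message id:", future.result())

# Pull messages for a few seconds, acking each one.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(project, "orders-sub")

def callback(message):
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=callback)
try:
    streaming_pull.result(timeout=10)
except futures.TimeoutError:
    streaming_pull.cancel()
```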
Posted 3 months ago
6 - 11 years
0 - 1 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
5+ years of experience in an integration background, with at least 4 years of CPI experience and end-to-end interface implementation experience. Integration expert with experience in developing custom process integrations and iFlows. Experience implementing interfaces end to end with protocols such as REST, SOAP, RFC, HTTP, HTTPS, IDoc, JMS, Process Direct, and AMQP. Should have good knowledge of API Management. Should have good knowledge of handling security artifacts, encryption/decryption mechanisms, certificate authentication, and PGP. Good hands-on skills handling processes, events, transformations, routings, and data operations. Experience in Groovy scripting and/or Java; basic knowledge of ABAP and Integration Suite. Should have strong experience with a variety of adapter types such as SFTP, SOAP, IDoc, Mail, XI, REST, and OData. Experience integrating with S/4HANA, GCS, and ECC. Experience with graphical mapping and XSLT mapping. Experience debugging end-to-end integrations and troubleshooting standard and custom integrations. Basic understanding of BTP global accounts and subaccounts, and of BTP concepts such as Cloud Connector and destination set-up on BTP. Experience handling different data conversions and data store operations. Experience developing custom flows and handling standard flows.
Posted 3 months ago
5 - 8 years
14 - 16 Lacs
Bengaluru
Remote
Hi all, We are hiring for the role of Python & GCP Engineer. Experience: 5+ years Location: Bangalore Notice Period: Immediate to 15 days Skills: Technical Expertise: Languages: Python, SQL, shell scripting Big Data: Kafka, PySpark, data warehousing, data lakes Cloud Platforms: GCP: GCS, BigQuery, Pub/Sub, Dataproc, Dataflow, Cloud Functions Database: Schema design, optimization, stored procedures DevOps: CI/CD pipeline implementation, multi-cloud deployment automation Development: Parallel processing, streaming, low-level design If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
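As a small illustration of the GCS work in the skills list above, uploading and listing objects with the google-cloud-storage client; the bucket and object names are placeholders:

```python
from google.cloud import storage

client = storage.Client()  # uses application-default credentials
bucket = client.bucket("my-data-bucket")  # placeholder bucket

# Upload a local file as an object.
blob = bucket.blob("raw/2024-01-01/events.json")
blob.upload_from_filename("events.json")

# List everything under the raw/ prefix.
for b in client.list_blobs("my-data-bucket", prefix="raw/"):
    print(b.name, b.size)
```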
Posted 3 months ago
12 - 17 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. ZoomInfo is a rapidly growing data-driven company, and as such we understand the importance of a comprehensive and solid data solution to support decision making in our organization. Our vision is to have a consistent, democratized, and accessible single source of truth for all company data analytics and reporting. Our goal is to improve decision-making processes by having the right information available when it is needed. As a Principal Software Engineer in our Data Platform infrastructure team you'll have a key role in building and designing the strategy of our Enterprise Data Engineering group. What You'll Do: Design and build a highly scalable data platform to support data pipelines for diversified and complex data flows. Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities. Deliver scalable, reliable and reusable data solutions. Lead, build and continuously improve our data gathering, modeling, and reporting capabilities and self-service data platforms. Work closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs. Develop processes and tools to monitor, analyze, maintain and improve data operation, performance and usability. What You Bring: Relevant Bachelor's degree or other equivalent software engineering background. 12+ years of experience as an infrastructure / data platform / big data software engineer. Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena. IaC design and hands-on experience. Familiarity designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools. Experience in designing, building and maintaining enterprise systems in a big data environment on public cloud. Strong SQL abilities and hands-on experience with SQL, performing analysis and performance optimizations. Hands-on experience in Python or an equivalent programming language. Experience with administering data warehouse solutions (like BigQuery / Redshift / Snowflake). Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation and maintenance. Experience with Airflow and DBT - advantage. Experience with Kubernetes using GKE or EKS - advantage. Experience with development practices such as Agile and TDD - advantage.
Posted 3 months ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description - Technical Skills: Top skills for this position: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS). Data warehousing knowledge. Hands-on experience in the Python language and SQL databases. Analytical technical skills to be able to predict the consequences of configuration changes (impact analysis), to identify root causes that are not obvious, and to understand the business requirements. Excellent communication with different stakeholders (business, technical, project). Good understanding of the overall Big Data and Data Science ecosystem. Experience with building and deploying containers as services using Swarm/Kubernetes. Good understanding of container concepts like building lean and secure images. Understanding of modern DevOps pipelines. Experience with stream data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer certification. Roles and Responsibilities: Design, build and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow) or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments and utilities to design cloud-optimal solutions for machine learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming or similar technologies. Manage the development life-cycle for agile software development projects. Convert a proof of concept into an industrialized solution for machine learning models (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning and understanding of dependencies. Develop and implement robust solutions in test and production environments.
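To illustrate the Dataflow side of this stack, a hedged sketch of a streaming Apache Beam pipeline: read from Pub/Sub, parse JSON, write to BigQuery. The topic, table, and schema are placeholders; pass --runner=DataflowRunner (plus project and region options) to run it on Dataflow:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse(raw: bytes) -> dict:
    """Decode one Pub/Sub message into a BigQuery-shaped row."""
    record = json.loads(raw.decode("utf-8"))
    return {"order_id": record["order_id"], "status": record["status"]}

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/orders")
        | "Parse" >> beam.Map(parse)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:orders.status_events",
            schema="order_id:STRING,status:STRING",
        )
    )
```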
Posted 1 month ago