5.0 - 10.0 years
8 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow)
Mandatory Key Skills: Google Cloud Platform, GCS, DataProc, BigQuery, Dataflow, Composer, Data Processing, Java
Posted 3 months ago
5.0 - 7.0 years
5 - 7 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary
As a Data Engineer III at Walmart, you will design, build, and maintain scalable data systems and architectures to enable advanced business intelligence and analytics. Your role will be pivotal in ensuring data quality, integrity, and security to support Walmart's data-driven decision-making and operational efficiency. You will collaborate closely with cross-functional teams to implement robust data solutions that drive business growth.

About the Team
The Walmart Last Mile Data team operates within the Data and Customer Analytics organization, focusing on optimizing Walmart's last mile delivery operations. This dynamic group uses cutting-edge data engineering and analytics technologies to improve delivery routing, reduce costs, and enhance customer satisfaction.

What You'll Do
• Design, develop, and deploy data pipelines and integration solutions using technologies such as Spark, Scala, Python, Airflow, and Google Cloud Platform (GCP).
• Build scalable, efficient data processing systems while optimizing workflows to ensure high data quality and availability.
• Monitor, troubleshoot, and tune data pipelines to maximize reliability and performance.
• Partner with executive leadership, product, data, and design teams to address data infrastructure needs and technical challenges.
• Provide data scientists and analysts with well-structured data sets and tools to facilitate analytics and reporting.
• Develop analytics tools that contribute to Walmart's position as an industry leader.
• Stay current with data engineering trends and technologies, incorporating best practices to enhance data infrastructure.

What You'll Bring
• Minimum 5 years of proven experience as a Data Engineer.
• Strong programming skills in Scala and experience with Spark for large-scale data processing and analytics.
• Expertise with Google Cloud Platform services such as BigQuery, Google Cloud Storage (GCS), and Dataproc.
• Experience building near real-time data ingestion pipelines using Kafka and Spark Structured Streaming (a minimal sketch follows below).
• Solid knowledge of data modeling, data warehousing concepts, and ETL processes.
• Proficiency in SQL and NoSQL databases.
• Experience with version control systems, preferably Git.
• Familiarity with distributed computing frameworks and working with large-scale data sets.
• Understanding of CI/CD pipelines and tools such as Jenkins or GitLab CI.
• Experience with workflow schedulers like Airflow.
• Strong analytical and problem-solving abilities.
• Familiarity with BI and visualization tools such as Tableau or Looker.
• Experience or interest in Generative AI is a plus but not required.

About Walmart Global Tech
At Walmart Global Tech, your work impacts millions worldwide by simplifying complex challenges through technology. Join a diverse team of innovators shaping the future of retail, where people and technology come together to drive lasting impact. We support career growth with continuous learning and flexible hybrid work models.

Minimum Qualifications
• Bachelor's degree in Computer Science or related field with at least 2 years of software engineering experience; OR
• 4 years of relevant software engineering or data engineering experience without a degree; OR
• Master's degree in Computer Science.
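The streaming requirement above (Kafka plus Spark Structured Streaming) can be pictured with a minimal PySpark sketch. The broker address, topic, event schema, and GCS paths are illustrative assumptions, not details from the listing, and a real submission would also need the spark-sql-kafka connector package.

```python
# Minimal sketch: near real-time ingestion from Kafka with Spark Structured Streaming.
# Broker, topic, schema, and output paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("delivery-events-ingest").getOrCreate()

event_schema = (StructType()
                .add("order_id", StringType())
                .add("status", StringType())
                .add("event_time", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
          .option("subscribe", "delivery-events")              # placeholder topic
          .load()
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "gs://example-bucket/delivery_events/")    # placeholder GCS path
         .option("checkpointLocation", "gs://example-bucket/chk/")  # checkpoint for recovery
         .outputMode("append")
         .start())
query.awaitTermination()
```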
Posted 3 months ago
6.0 - 8.0 years
18 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow)
Posted 3 months ago
5.0 - 10.0 years
15 - 20 Lacs
Chennai
Work from Office
Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow)
Mandatory Key Skills: Google Cloud Platform, GCS, DataProc, BigQuery, Dataflow, Composer, Data Processing, Java
Posted 3 months ago
7.0 - 10.0 years
13 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
We are seeking a skilled Salesforce Data Cloud Developer to join our dynamic IT team. This role involves developing and maintaining Salesforce Data Cloud solutions to enhance our mobile services and customer experience. The ideal candidate will have a strong background in Salesforce development, data integration, and cloud technologies.
• Clear understanding of legacy systems and databases and of integrating them with the Salesforce ecosystem; streamline data ingestion between Salesforce Data Cloud and our databases to ensure seamless data flow and accuracy.
• Utilize Salesforce Data Cloud to gain deeper insights into customer behavior and preferences, driving personalized mobile experiences.
• Implement AI-driven solutions and automation within Salesforce Data Cloud to optimize mobile service delivery and customer engagement.
• Enhance mobile data strategies to support innovative mobile solutions and improve overall user experience.
• Knowledge of AWS and GCS to support integration with Salesforce; thorough knowledge of the connectors available in Data Cloud.
• Hands-on work with DLOs, DMOs, SQL, segmentation, and activation.
• Clear understanding of integrating Data Cloud with Marketing Cloud.
• Experience with available Data Cloud connectors such as BigQuery and GCS.
• Development: Design, develop, and implement custom Salesforce Data Cloud applications and enhancements tailored to organizational requirements.
• Integration: Perform data integration and migration tasks, ensuring data accuracy and integrity across Salesforce and other systems (see the sketch following this listing).
• Collaboration: Work closely with cross-functional teams, including marketing and technical teams, to align Salesforce solutions with business objectives.
• Documentation: Create and maintain comprehensive documentation on processes, policies, application configurations, and user guides.
• Testing: Conduct thorough testing and debugging of Salesforce applications to ensure high performance and reliability.
• Bachelor's degree in Computer Science, Information Technology, Business, Engineering, or a related field.
• Minimum of 4-5 years of experience in the Salesforce ecosystem, with at least 2-3 years of hands-on experience with Salesforce Data Cloud.
• Strong ability to manage and communicate with both technical and non-technical stakeholders.
• Solid understanding of software development, systems integration, and cloud technologies.
• Strong strategic thinking and planning skills.
• Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
• Experience with Agile methodologies and version control systems like Git.
• MBA or advanced degree in a related field.
• Salesforce Data Cloud certification (e.g., Salesforce Data Cloud Consultant).
• Knowledge of cloud environments like AWS, Azure, or Google Cloud Platform.
• Experience in API integration with legacy systems.
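As a loose illustration of the integration work described above, here is a minimal sketch that spot-checks Salesforce records after an ingestion run using the simple_salesforce library; the library choice, credentials, and object/field names are assumptions and are not taken from the posting.

```python
# Illustrative only: reconcile recently modified Salesforce records against a source
# database after an ingestion run. All credentials and names are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")  # placeholders

# Pull a small sample of recently modified contacts for a spot check.
result = sf.query(
    "SELECT Id, Email, LastModifiedDate FROM Contact "
    "ORDER BY LastModifiedDate DESC LIMIT 10"
)
for record in result["records"]:
    print(record["Id"], record["Email"], record["LastModifiedDate"])
```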
Posted 3 months ago
5.0 - 7.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 316017
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Zabbix Administrator to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Zabbix Administration and Support - Roles and Responsibilities
• In-depth knowledge of enterprise monitoring tool architecture, administration, and configuration.
• Technically manage the design and implementation of the Zabbix tool; hands-on experience with end-to-end deployment.
• In-depth knowledge of systems management, monitoring tools, ITIL processes, and integrations with different tools and scripting.
• Good understanding of automation and enterprise-wide monitoring tooling solutions.
• Hands-on experience integrating enterprise monitoring tools with ITSM platforms (a brief API sketch follows this listing).
• Minimum 5 years of hands-on experience administering and configuring enterprise monitoring tools at an L3 level.
• Knowledge of IT infrastructure programming/scripting (Shell, JSON, MySQL, Python, Perl).
• Good understanding of operating systems (Windows and Unix).
• Good knowledge of public cloud platforms (Azure, AWS, Google Cloud).
• Install and configure software and hardware; apply Zabbix patches and upgrades as they become available.
• Lead troubleshooting of issues and outages; provide technical support for internal and external customers, primarily for Zabbix.
• Undertake individual assignments or work as part of a larger team analyzing customer requirements, gathering and analyzing data, and recommending solutions. Ensure assignments are undertaken consistently and with quality, and produce and update assignment documentation as required.
• Experienced in customer interaction, with good verbal and written communication skills; able to deal with internal and external stakeholders independently during transitions and project-driven activities.
• Willing to work in a 24x7 environment.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .
NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click .
If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
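As a rough illustration of the Zabbix integration work described in the listing above, the sketch below calls the Zabbix JSON-RPC API with Python's requests library. The server URL and credentials are placeholders, and the login parameter name varies by version ("username" on Zabbix 6.x and later, "user" on older releases).

```python
# Minimal sketch of the Zabbix JSON-RPC API via requests. URL and credentials are placeholders.
import requests

API_URL = "https://zabbix.example.com/api_jsonrpc.php"  # placeholder server

def zbx(method, params, auth=None, req_id=1):
    """Send one JSON-RPC request to the Zabbix API and return its result."""
    payload = {"jsonrpc": "2.0", "method": method, "params": params, "id": req_id}
    if auth:
        payload["auth"] = auth
    resp = requests.post(API_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]

# Log in and list monitored hosts, e.g. to feed an ITSM/CMDB sync.
token = zbx("user.login", {"username": "api_user", "password": "secret"})  # placeholders
hosts = zbx("host.get", {"output": ["hostid", "host", "status"]}, auth=token)
for h in hosts:
    print(h["hostid"], h["host"], h["status"])
```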
Posted 3 months ago
2 - 4 years
5 - 8 Lacs
Pune
Work from Office
We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.
Responsibilities:
• Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
• GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
• Design and Architecture: Participate in design reviews with peers and stakeholders.
• Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
• Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
• Debugging and Troubleshooting: Triage defects or customer-reported issues, and debug and resolve them in a timely and efficient manner.
• Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
• DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
• Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.
Basic Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
• 2+ years of professional software development experience.
• Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks (a small sketch follows this listing).
• Experience with software development practices and design patterns.
• Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
• Basic understanding of cloud technologies and DevOps principles.
• Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.
Preferred Qualifications:
• Experience with object-oriented programming, concurrency, design patterns, and REST APIs.
• Experience with CI/CD tooling such as Terraform and GitHub Actions.
• High-level familiarity with AI/ML, GenAI, and MLOps concepts.
• Familiarity with frameworks like LangChain and LangGraph.
• Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
• Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc.
• Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow.
• Experience with Docker and Kubernetes.
• Experience with Java and Scala is a plus.
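To make the FastAPI/PyTest stack named above concrete, here is a minimal, hypothetical sketch of an endpoint and its test; the route, model, and placeholder "summarization" logic are illustrative and not part of the posting.

```python
# Minimal sketch: a FastAPI endpoint with a PyTest-style test via TestClient.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class SummarizeRequest(BaseModel):
    text: str

@app.post("/summarize")
def summarize(req: SummarizeRequest) -> dict:
    # Placeholder logic: truncate the input; a real service would call a GenAI backend.
    return {"summary": req.text[:80]}

def test_summarize():
    client = TestClient(app)
    resp = client.post("/summarize", json={"text": "FastAPI makes it easy to ship APIs."})
    assert resp.status_code == 200
    assert resp.json()["summary"].startswith("FastAPI")
```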
Posted 4 months ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description
Technical Skills (top skills for this position):
• Google Cloud Platform (Composer, BigQuery, Airflow, DataProc, Dataflow, GCS)
• Data warehousing knowledge
• Hands-on experience with the Python language and SQL databases
• Analytical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements
• Excellent communication with different stakeholders (business, technical, project)
• Good understanding of the overall Big Data and Data Science ecosystem
• Experience building and deploying containers as services using Swarm/Kubernetes
• Good understanding of container concepts such as building lean and secure images
• Understanding of modern DevOps pipelines
• Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources)
• Good to have: Professional Data Engineer or Associate Data Engineer certification
Roles and Responsibilities:
• Design, build, and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc (a minimal pipeline sketch follows this listing)
• Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud
• Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications
• Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies
• Manage the development life cycle for agile software development projects
• Convert proofs of concept into industrialized Machine Learning models (MLOps)
• Provide solutions to complex problems; deliver customer-oriented solutions in a timely, collaborative manner
• Proactive thinking, planning, and understanding of dependencies
• Develop and implement robust solutions in test and production environments.
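The ingestion work described above can be pictured with a minimal Apache Beam (Python SDK) sketch of a GCS-to-BigQuery batch pipeline; the bucket, project, table, and schema are assumed placeholders, and running it on Dataflow would additionally need --runner=DataflowRunner plus project, region, and temp_location options.

```python
# Minimal Apache Beam sketch: read CSV lines from GCS, parse, write to BigQuery.
# All resource names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    order_id, city, amount = line.split(",")
    return {"order_id": order_id, "city": city, "amount": float(amount)}

with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://example-bucket/orders/*.csv")  # placeholder bucket
     | "Parse" >> beam.Map(parse_line)
     | "Write" >> beam.io.WriteToBigQuery(
           "example-project:analytics.orders",                             # placeholder table
           schema="order_id:STRING,city:STRING,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```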
Posted 4 months ago
7.0 - 12.0 years
30 - 45 Lacs
Bengaluru
Work from Office
About the Role
We are looking for a seasoned Engineering Manager well-versed in emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry on collaborations effectively. You will also transform newbies into experts and build reports on the progress of all projects.
What you will do
• Design tasks for other engineers as per Meesho's guidelines
• Perform regular performance evaluations and share and seek feedback
• Keep a close eye on various projects and monitor their progress
• Carry on smooth collaborations with the sales and design teams to innovate on new products
• Manage engineers and take ownership of the project while ensuring product scalability
• Conduct regular meetings to plan and develop reports on the progress of projects
What you will need
• Bachelor's/Master's in Computer Science
• At least 7+ years of professional experience
• At least 2 years of experience managing software development teams
• Able to drive sprints and OKRs
• Deep understanding of transactional and NoSQL databases
• Deep understanding of messaging systems such as Kafka
• Good experience with cloud infrastructure: AWS/GCS
• Good to have: data pipelines, ES
• Exceptional team management skills; experience building large-scale distributed systems
• Experience in scalable systems
• Expertise in Java/Python and multithreading
Posted Date not available
4.0 - 8.0 years
8 - 14 Lacs
Hyderabad, Bengaluru
Work from Office
Looking for a GCP Data Engineer with 4+ years of experience. Must have strong skills in Python, BigQuery, Dataflow, Pub/Sub, GCS, Airflow, and DevOps. Experience in cloud migration, automation, and scripting (Python/Shell) is required. GCP certification is a plus.
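A minimal sketch of the Airflow-plus-BigQuery combination mentioned above, assuming Airflow 2.x with the Google provider package installed; the DAG id, schedule, SQL, and table names are placeholders.

```python
# Minimal Airflow 2.x DAG sketch: run a daily BigQuery load job.
# Requires the apache-airflow-providers-google package; all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_load",          # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": "SELECT * FROM `example-project.staging.orders` "
                         "WHERE load_date = '{{ ds }}'",   # placeholder SQL
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```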
Posted Date not available
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Designing, deploying, and managing applications and infrastructure on Google Cloud. Responsible for maintaining solutions that leverage Google-managed or self-managed services, using both the Google Cloud Console and the command-line interface.
Required Candidate Profile
• Designing and implementing cloud solutions
• Deploying and managing applications
• Monitoring and maintaining cloud infrastructure
• Utilizing cloud services
• Automation and DevOps
Posted Date not available