
3291 Big Data Jobs - Page 8

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

Develop data pipelines and apply methods and tools to collect, store, process, and analyze complex data sets globally for assigned operations or functions. Design, govern, build, and operate solutions for large-scale data architectures and applications across businesses and functions. Select, manage, and work hands-on with big data tools and frameworks; implement ETL (extract, transform, load) tools and processes as well as data virtualization and federation services. Engineer data integration pipelines and reusable data services using cross-functional data models, semantic technologies, and data integration solutions. Define, implement, and apply data governance policies for all data flows of the data architecture, with a focus on the digital platform and data lake. Define and implement policies for data ingestion, retention, lineage, access, and data service API management and usage, in collaboration with data management and IT functions. Your Qualifications: a graduate degree in Computer Science, Applied Computer Science, or Software Engineering, and 3 to 5 years of experience.
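
To make the ETL responsibilities above concrete, here is a minimal PySpark batch pipeline sketch. It is illustrative only, not the employer's actual stack; the bucket paths and column names (operation_id, event_ts, amount) are hypothetical placeholders.

```python
# A minimal PySpark batch ETL sketch, assuming Spark 3.x; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ops-etl").getOrCreate()

# Extract: read raw events from a landing zone (hypothetical path)
raw = spark.read.option("header", True).csv("s3a://landing/ops_events/")

# Transform: type casting, basic cleansing, and a daily aggregate per operation
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["operation_id", "event_ts"])
)
daily = (
    clean.groupBy("operation_id", F.to_date("event_ts").alias("event_date"))
         .agg(F.count("*").alias("events"), F.sum("amount").alias("total_amount"))
)

# Load: write a partitioned Parquet table into the data lake
daily.write.mode("overwrite").partitionBy("event_date").parquet("s3a://lake/ops_daily/")
```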

Posted 1 week ago

Apply

11.0 - 20.0 years

8 - 18 Lacs

Chennai, Bengaluru, PAN INDIA

Hybrid

Job Description: Hiring for Big Data Lead Developer
Experience Range: 5 to 18 years
Mandatory Skills: Big Data ecosystem (Hadoop, Spark, Kafka, Hive, HBase); functional programming in Scala
Primary Skills:
- Functional Programming: Scala
- Data Processing: Apache Spark, Flink
- Data Storage: HDFS, Hive, Cassandra
- Data Streaming: Kafka, Storm
- Cloud Platforms: AWS, Azure, GCP
- DevOps Tools: Docker, Kubernetes, Jenkins
- Version Control: Git
Education: BE/B.Tech/MCA/M.Tech/MSc/MSTS
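
As an illustration of the Spark-plus-Kafka processing stack named above, the sketch below shows a minimal Structured Streaming job. It is written in PySpark for brevity even though the posting emphasizes Scala; the broker address, topic name, and JSON schema are hypothetical.

```python
# A minimal Spark Structured Streaming sketch reading from Kafka; broker, topic, and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
         .option("subscribe", "orders")                      # hypothetical topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Write a running aggregate to the console; a real job would target Hive/HDFS or a lakehouse sink
query = (
    events.groupBy("order_id").agg(F.sum("amount").alias("total"))
          .writeStream.outputMode("complete").format("console").start()
)
query.awaitTermination()
```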

Posted 1 week ago

Apply

5.0 - 9.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Primary Skill Set: Data Engineering, Python, PySpark, Cloud (AWS/GCP), Scala.
Primary Skills: Snowflake, Cloud (AWS, GCP), Scala, Python, Spark, Big Data, and SQL.
Qualification: Bachelor's or Master's degree.
Job Responsibilities:
- Strong development experience in Snowflake, Cloud (AWS, GCP), Scala, Python, Spark, Big Data, and SQL.
- Work closely with stakeholders, including product managers and designers, to align technical solutions with business goals.
- Maintain code quality through reviews and make architectural decisions that impact scalability and performance.
- Perform root-cause analysis for critical defects; address technical challenges, optimize workflows, and resolve issues efficiently.
- Expert in Agile and Waterfall program/project implementation.
- Manage strategic and tactical relationships with program stakeholders.
- Successfully execute projects within strict deadlines while managing intense pressure.
- Good understanding of the SDLC (Software Development Life Cycle).
- Identify potential technical risks and implement mitigation strategies.
- Excellent verbal, written, and interpersonal communication abilities, coupled with strong problem-solving, facilitation, and analytical skills.
- Cloud management activities: a good understanding of cloud architecture/containerization and application management on AWS and Kubernetes.
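
For the Snowflake development experience mentioned above, a minimal example of querying Snowflake from Python with the official connector might look like the following; the account, credentials, and table are hypothetical placeholders.

```python
# A minimal sketch of querying Snowflake from Python; connection details and table are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT region, SUM(amount) AS total FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```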

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Hyderabad

Hybrid

Basic Qualifications:
- Bachelor's degree in Computer Science or equivalent
- 6 to 10 years of experience in application development
- Willingness to learn and apply new technologies
- Excellent communication skills, with strong verbal and written proficiency
- Good work ethic; a self-starter who is results-oriented
- Excellent problem-solving and troubleshooting skills
- Ability to manage multiple priorities efficiently and effectively within specific timeframes
- Strong hands-on development experience in C# and Python
- Strong hands-on experience building large-scale solutions using a big data technology stack such as Spark, microservice architectures, and tools like Docker and Kubernetes
- Experience conducting application design and code reviews
- Able to demonstrate strong OOP skills
- Proficient with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
- Experience implementing web services
- Experience working with SQL Server, including writing stored procedures, triggers, and performance tuning
- Experience working in cloud computing environments such as AWS
Preferred Qualifications:
- Experience with large-scale messaging systems such as Kafka is a plus
- Experience with big data technologies such as Elasticsearch and Spark is a plus
- Experience working with Snowflake is a plus
- Experience with Linux-based environments is a plus

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Required Skills: Successful candidates will have demonstrated the following skills and characteristics.
Must-Have (Senior Associate - CPG & Retail Analytics):
- Hands-on experience with CPG & Retail data assets such as point-of-sale (POS) transactions, loyalty-program histories, syndicated panel data (NielsenIQ, Circana/IRI, Kantar), e-commerce clickstream, and supply-chain sensors/IoT.
- Deep understanding of predictive modeling for purchase propensity, demand forecasting, promotion uplift, assortment, and customer lifetime value in omnichannel retail environments.
- Proficiency in machine learning techniques (classification, regression, clustering, recommendation engines, and advanced time-series forecasting) applied to sales and inventory data.
- Strong command of statistical methods, including A/B and multivariate testing, price-elasticity modeling, market-basket analysis, segmentation, and causal-impact evaluation.
- Expert programming in Python or R, plus SQL, with retail-specific data-wrangling libraries (pandas, dbt) across cloud warehouses (BigQuery, Redshift, Synapse).
- Hands-on BI fluency in developing interactive visualizations in Power BI, Tableau, or Looker that turn SKU-level data into actionable stories for category, trade-marketing, and supply-chain teams.
- Exposure to big data and real-time analytics stacks used in retail (Spark, Kafka, Delta Lake, Snowflake Streaming, Google Vertex AI).
- Ability to translate insights for merchandising, marketing, supply-chain, and store-operations stakeholders, communicating clearly with both technical data teams and business leaders.
Nice to Have:
- Exposure to ML engineering (Azure ML, AWS SageMaker, GCP Vertex AI).
- Experience blending retail-media impression/click logs with sales to calculate ROAS and incrementality, or to build MMM-lite dashboards for brand teams.
Roles and Responsibilities:
- Assist analytics projects within the CPG & Retail domain, driving design, development, and delivery of data science solutions.
- Develop and execute project and analysis plans under the guidance of the Project Manager.
- Interact with and advise US-based consultants/clients as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved.
- Drive and conduct analysis using advanced analytics tools and coach junior team members.
- Implement the necessary quality-control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments.
- Validate analysis outcomes and recommendations with all stakeholders, including the client team.
- Build storylines and make presentations to the client team and/or the PwC project leadership team.
- Contribute to knowledge-sharing and firm-building activities.
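
One of the statistical methods listed above, price-elasticity modeling, can be sketched as a log-log regression. The example below uses synthetic data with an assumed true elasticity, so it illustrates the technique rather than any production workflow.

```python
# A minimal price-elasticity sketch using a log-log OLS regression on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
price = rng.uniform(2.0, 6.0, size=500)
# Synthetic demand generated with a true own-price elasticity of about -1.5
qty = np.exp(8.0 - 1.5 * np.log(price) + rng.normal(0, 0.2, size=500))

X = sm.add_constant(np.log(price))
model = sm.OLS(np.log(qty), X).fit()

# The slope on log(price) is the estimated own-price elasticity
print("estimated elasticity:", model.params[1])
```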

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

punjab

On-site

Job Summary: As a Technical Support Engineer (TSE) in the education industry, you will serve as a Subject Matter Expert for a portfolio of enterprise accounts. Your primary responsibilities will include addressing technical inquiries in standard and custom deployment environments, aiding in supported LTS upgrades, conducting peer training, advancing personal education, and contributing to reference documentation. Collaboration with Support leadership, Engineering, Product, and Accounts teams is essential to ensure customers receive a valuable enterprise experience. You should demonstrate a high level of proficiency in both SEP and Galaxy, working independently with minimal supervision.
Responsibilities
Technical Support:
- Support standard and custom deployments
- Address technical queries through the SFDC ticketing system
- Reproduce reported issues, identify root causes, and offer solutions
- Report SEP and Galaxy bugs in Jira and feature requests in Aha!
LTS Upgrades:
- Assist with upgrades upon customer request
- Ensure customers are on a supported LTS version
- Communicate unsupported LTS requests to the Account team
Monthly Technical Check-ins:
- Conduct scheduled technical check-ins with Business Units
- Review open support tickets, give updates on product bugs, and recommend best practices
- Ensure customer environments are on supported LTS versions
Knowledge Sharing/Technical Enablement:
- Contribute to reference documentation
- Provide peer training
- Assist content teams
- Pursue personal technical education
Project Involvement:
- Contribute to departmental and cross-functional initiatives
- Collaborate with Leadership to identify and address operational inefficiencies
- Provide feedback on educational opportunities and project ideas
Requirements
- 5+ years of support experience
- 3+ years of experience in Big Data, Docker, Kubernetes, and cloud technologies
Skills
- Proficiency in Big Data technologies (Teradata, Hadoop, data lakes, Spark)
- Experience with Docker and Kubernetes
- Knowledge of cloud technologies (AWS, Azure, GCP)
- Familiarity with security protocols such as LDAP, OAuth 2.0, and SSL/TLS
- Linux skills
- Understanding of DBMS concepts and SQL
- Proficiency in SQL, Java, Python, and Bash
Benefits - Why Join Us:
- Engage with a globally recognized team on innovative AI/ML projects
- Embrace a culture that values curiosity, continuous learning, and impact
- Collaborate with Fortune 500 clients and industry leaders
- Opportunity to grow your career by becoming an instructor and sharing knowledge worldwide

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chandigarh

On-site

You will be joining the Microsoft Security organization, where security is a top priority due to increasing digital threats, regulatory scrutiny, and complex estate environments. Microsoft Security aims to make the world a safer place by providing end-to-end security solutions that empower users, customers, and developers. As a Senior Data Scientist, you will be instrumental in enhancing our security posture by developing innovative models to detect and predict security threats. This role requires a deep understanding of data science, machine learning, and cybersecurity, along with the ability to analyze large datasets and collaborate with security experts to address emerging threats and vulnerabilities.

Your responsibilities will include understanding complex cybersecurity and business problems, translating them into well-defined data science problems, and building scalable solutions. You will develop and deploy production-grade AI/ML systems for real-time threat detection, analyze large datasets to identify security risks, and collaborate with security experts to incorporate domain knowledge into models. Additionally, you will lead the design and implementation of data-driven security solutions, mentor junior data scientists, and communicate findings to stakeholders.

To qualify for this role, you should have experience in developing and deploying machine learning models for security applications, preferably in a Big Data or cybersecurity environment. You should be familiar with the Azure tech stack, have knowledge of anomaly detection and fraud detection, and possess expertise in programming languages such as Python, R, or Scala. A Doctorate or Master's degree in a related field, along with 5+ years of data science experience, is preferred. Strong analytical, problem-solving, and communication skills are essential, as well as proficiency in machine learning frameworks and cybersecurity principles.

Preferred qualifications include additional experience in developing machine learning models for security applications, familiarity with data science workloads on the Azure tech stack, and contributions to the field of data science or cybersecurity. Your ability to drive large-scale system designs, think creatively, and translate complex data into actionable insights will be crucial in this role.
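
Anomaly detection is called out above as a core skill. A minimal unsupervised sketch using scikit-learn's IsolationForest on synthetic sign-in telemetry might look like the following; the feature layout and contamination rate are hypothetical illustrations, not Microsoft's actual detection pipeline.

```python
# A minimal anomaly-detection sketch with IsolationForest on a synthetic stand-in
# for security telemetry (e.g., per-account sign-in counts and failed attempts).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=[20, 2], scale=[5, 1], size=(1000, 2))    # typical accounts
outliers = rng.normal(loc=[200, 40], scale=[10, 5], size=(10, 2))  # suspicious spikes
X = np.vstack([normal, outliers])

model = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
labels = model.fit_predict(X)          # -1 flags anomalies, 1 marks normal points
print("flagged as anomalous:", int((labels == -1).sum()))
```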

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a member of the Infosys consulting team, your main responsibility will be to address customer issues, identify problem areas, create innovative solutions, and oversee implementation to ensure client satisfaction. You will be involved in developing proposals, contributing to solution design, configuring products, conducting pilot sessions, and resolving queries related to requirements and design. Additionally, you will conduct product demonstrations and Proof of Concept (POC) workshops, and provide effort estimates in line with customer budgetary constraints and organizational financial guidelines. Leading small projects and engaging in unit-level and organizational initiatives to deliver high-quality, value-added solutions to customers will also be part of your role.

Your technical expertise should encompass Big Data technologies, specifically Bigtable; cloud integration, including Azure Data Factory (ADF); and data on cloud platforms like AWS. Furthermore, you are expected to devise strategies and models that drive innovation, growth, and profitability for clients; possess knowledge of software configuration management systems; stay updated on emerging technologies and industry trends; exhibit logical thinking and problem-solving skills; and collaborate effectively. Familiarity with financial processes across project types and pricing models, the ability to identify process enhancements and suggest technology solutions, and client-interfacing, project management, and team management skills are essential.

Preferred skills include proficiency in Google Cloud Platform (GCP) for Big Data, Azure Data Factory (ADF) for cloud integration, and AWS for data on cloud platforms. If you are passionate about guiding clients through their digital transformation journey, this opportunity at Infosys is tailored for you.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As a Data Analyst at Infosys, you will play a crucial role in collecting, cleaning, analyzing, and interpreting large datasets from various sources. Your insights will drive informed decisions across the organization, and your ability to effectively communicate data findings to stakeholders at all levels will be essential.
Key Responsibilities:
- Collect, clean, and organize large datasets to extract valuable insights
- Utilize statistical methods, machine learning techniques, and data visualization tools for data analysis
- Identify patterns, trends, and anomalies within datasets to uncover valuable insights
- Develop and maintain data models that accurately represent the organization's business operations
- Create interactive dashboards and reports to effectively communicate data findings to stakeholders
- Document data analysis procedures and findings to facilitate knowledge transfer within the organization
Technical Requirements:
- Minimum 5 years of experience as a Data Analyst or in a similar role
- Proven track record of collecting, cleaning, analyzing, and interpreting large datasets
- Expertise in pipeline design and validation
Additional Responsibilities:
- Demonstrate a high degree of initiative and flexibility
- Maintain a strong customer orientation
- Uphold high quality awareness in all data analysis processes
- Possess excellent verbal and written communication skills
- Exhibit logical thinking, problem-solving abilities, and a collaborative mindset
- Have knowledge of two or three industry domains
- Understand financial processes for different project types and pricing models
- Demonstrate client-interfacing skills
- Familiarity with SDLC and agile methodologies
- Experience in project and team management
Preferred Skills:
- Proficiency in Python for big data analytics
- Familiarity with Google Cloud Platform services, including Google Container Registry and Google Big Data
- Experience in reporting, analytics, and visualization tools like Pentaho Reporting
Join us at Infosys, where our culture of innovation and empowerment will inspire you to build the future together. Your career will never stand still as we navigate further together, leveraging our values, trusted relationships, and commitment to learnability.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

kochi, kerala

On-site

As a Data Scientist at our company, you will be responsible for delivering end-to-end projects in the Analytics space. You must possess skills in Big Data and Python or R to drive business results through data-based insights. Your role will involve working with various stakeholders and functional teams, discovering solutions in large data sets, and improving business outcomes. Identifying valuable data sources and collection processes will be a key part of your responsibilities. You will oversee the preprocessing of both structured and unstructured data and analyze large volumes of information to identify trends and patterns specific to the insurance industry. Building predictive models, machine-learning algorithms, and ensemble models will be essential tasks. You will also be required to present information using data visualization techniques and collaborate with engineering and product development teams.

To excel in this role, you should have 3.5-5 years of experience in Analytics systems/program delivery, with a minimum of 2 project implementations in Big Data or Advanced Analytics. Proficiency in statistical computing languages such as R, Python, SQL, and PySpark is necessary. Familiarity with Scala, Java, or C++ and knowledge of various machine learning techniques and advanced statistical concepts are also required. Hands-on experience with Azure/AWS analytics platforms, Databricks or similar analytical applications, business intelligence tools like Tableau, and data frameworks such as Hadoop is crucial. Strong mathematical skills and excellent communication and presentation abilities are essential for this role.

In addition to technical skills, you should have multi-industry domain experience, expertise in Python, Scala, and SQL, and knowledge of Tableau/Power BI or similar self-service visualization tools. Your interpersonal and team skills should be exemplary, and any past leadership experience would be advantageous. Join our team at Accenture and leverage your expertise to drive impactful business outcomes through data-driven insights.

Experience: 3.5 - 5 years of relevant experience required
Educational Qualification: Graduation

Posted 1 week ago

Apply

15.0 - 21.0 years

0 Lacs

haryana

On-site

The Data Architecture Specialist: Join our team of data architects who design and execute industry-relevant reinventions that allow organizations to realize exceptional business value from technology.
Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data Architecture | Level: Sr Manager | Location: Bangalore/Mumbai/Pune/Gurugram | Years of Exp: 15 to 21 years
Explore an Exciting Career at Accenture: Are you a problem solver who is passionate about tech-driven transformation? Do you want to design, build, and implement strategies to enhance business architecture performance? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.
The Practice - A Brief Sketch: The Technology Strategy & Advisory team helps clients achieve growth and efficiency through innovative R&D transformation aimed at redefining business models using agile methodologies. As part of this high-performing Technology Strategy & Advisory team, you will work closely with our clients to unlock the value of data, architecture, and AI to drive business agility and transformation to a real-time enterprise. As a leading Data Architecture Consulting professional, you will work on the following areas:
- Business Problem Data Analysis: Identifying, assessing, and solving complex business problems using in-depth evaluation of variable factors.
- Technology-Driven Journey Intersection: Helping clients design, architect, and scale their journey to new technology-driven growth.
- Architecture Transformation: Helping solve key business problems by enabling an architecture transformation from the current state to a to-be enterprise environment.
- High-Performance Growth and Innovation: Assisting our clients to build the capabilities required for growth and innovation to sustain high performance.
Bring your best skills forward to excel at the role:
- Present data strategy and develop technology solutions and value-adding propositions to drive C-suite/senior-leadership-level discussions.
- Capitalize on an in-depth understanding of the latest technologies such as big data, data integration, data governance, data quality, cloud platforms, data modelling tools, data warehouses, and hosting environments.
- Lead proof-of-concept and/or pilot implementations and define the plan to scale implementations across multiple technology domains.
- Maximize subject matter expertise on data-led projects and play a key role in pitches where data-based RFP responses are discussed.
- Demonstrate the ability to work creatively and analytically in a problem-solving environment.
- Use knowledge of the key value drivers of a business and how they impact the scope and approach of the engagement.
- Develop client-handling skills to develop, manage, and deepen relationships with key stakeholders.
- Leverage team-building skills to collaborate with and motivate teams with diverse skills and experience to achieve goals.
- Build on leadership skills, along with strong communication, problem-solving, organizational, and delegation skills, to nurture and inspire team members.
Your experience counts!
- MBA from a tier 1 institute.
- Prior experience in one or more of the following is important:
- Assessment of information strategy maturity and evaluation of new IT potential, with a focus on data monetization, platforms, customer 360 views, and analytics strategy.
- Defining data-based strategy and establishing the to-be information architecture landscape.
- Design of cutting-edge solutions using cloud platforms like AWS, Azure, GCP, etc., and conceptualization of data models.
- Establishing a framework for effective data governance and defining data ownership, standards, policies, and associated processes.
- Product/framework/tools evaluation: collaborating with business experts for business understanding, working with other consultants and platform engineers on solutions, and working with technology teams for prototyping and client implementations.
- Evaluating existing products and frameworks and developing options for proposed solutions.
- Practical industry expertise: the areas of Financial Services, Retail, Telecommunications, Life Sciences, and Mining and Resources are of interest, but experience in equivalent domains is also welcomed. Consultants should understand the key technology trends in their domain and the related business implications.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

KLA is a global leader in diversified electronics for the semiconductor manufacturing ecosystem. Virtually every electronic device in the world is produced using our technologies. No laptop, smartphone, wearable device, voice-controlled gadget, flexible screen, VR device, or smart car would have made it into your hands without us. KLA invents systems and solutions for the manufacturing of wafers and reticles, integrated circuits, packaging, printed circuit boards, and flat panel displays. The innovative ideas and devices that are advancing humanity all begin with inspiration, research, and development. KLA focuses more than average on innovation, and we invest 15% of sales back into R&D. Our expert teams of physicists, engineers, data scientists, and problem-solvers work together with the world's leading technology providers to accelerate the delivery of tomorrow's electronic devices. Life here is exciting, and our teams thrive on tackling really hard problems. There is never a dull moment with us.

The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT's mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity, and technological excellence enables employee productivity, business analytics, and process excellence.

As a Sr. Data Engineer on the Data Sciences and Analytics team, you will play a key role in KLA's data strategy principles and techniques. As part of the centralized analytics team, you will help analyze and find key data insights into various business unit processes across the company. You will provide key performance indicators and dashboards to help business users and partners make business-critical decisions. You will craft and develop analytical solutions by capturing business requirements and translating them into technical specifications, building data models, and building data visualizations.

Responsibilities:
- Design, develop, and deploy Microsoft Fabric solutions and Power BI reports and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop data models and establish data connections to various data sources. Apply expert knowledge of Microsoft Fabric architecture, deployment, and management.
- Optimize Power BI solutions for performance and scalability.
- Implement best practices for data visualization and user experience.
- Conduct code reviews and provide mentorship to junior developers.
- Manage permissions and workspaces in Power BI, ensuring a secure and efficient analytics platform.
- Conduct assessments and audits of existing Microsoft Fabric environments to identify areas for improvement.
- Stay current with the latest Fabric and Power BI features and updates.
- Troubleshoot and resolve issues related to Fabric objects, Power BI reports, and data sources.
- Create detailed documentation, including design specifications, implementation plans, and user guides.

Minimum Qualifications:
- Doctorate degree with 0 years of related work experience; Master's degree with 3 years of related work experience; or Bachelor's degree with 5 years of related work experience.
- Proven experience as a Power BI developer, with a strong portfolio of Power BI projects.
- In-depth knowledge of Power BI, including DAX, Power Query, and data modeling.
- Experience with SQL and other data manipulation languages.
- In-depth knowledge of Microsoft Fabric and Power BI, including their components and capabilities.
- Strong understanding of Azure cloud computing, data integration, and data management.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Excellent technical problem-solving and performance-optimization skills.
- Specialist in SQL and stored procedures, with data warehouse concepts.
- Experience performing ETL (extract, transform, load) processes.
- Exceptional communication and interpersonal skills.
- Expert knowledge of cloud and big data concepts and tools: Azure, AWS, data lakes, Snowflake, etc.

Nice to Have:
- Extremely strong SQL skills.
- Foundational knowledge of metadata management, master data management, data governance, and data analytics.
- Technical knowledge of Databricks, data lakes, Spark, and SQL.
- Experience configuring SSO (Single Sign-On), RBAC, and security roles on an analytics platform.
- SAP functional knowledge is a plus.
- Microsoft certifications related to Microsoft Fabric/Power BI or Azure/analytics are a plus.
- Good understanding of requirements and converting them into data warehouse solutions.

We offer a competitive, family-friendly total rewards package. We design our programs to reflect our commitment to an inclusive environment, while ensuring we provide benefits that meet the diverse needs of our employees. KLA is proud to be an equal opportunity employer.

Be aware of potentially fraudulent job postings or suspicious recruiting activity by persons posing as KLA employees. KLA never asks for any financial compensation to be considered for an interview, to become an employee, or for equipment. Further, KLA does not work with any recruiters or third parties who charge such fees either directly or on behalf of KLA. Please ensure that you have searched KLA's Careers website for legitimate job postings. KLA follows a recruiting process that involves multiple interviews in person or via video conferencing with our hiring managers. If you are concerned that a communication, an interview, an offer of employment, or an employee is not legitimate, please send an email to talent.acquisition@kla.com to confirm that the person you are communicating with is an employee. We take your privacy very seriously and handle your information confidentially.

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

chennai, tamil nadu

On-site

As a member of the US Omni Tech team at Walmart Global Tech, you will play a crucial role in enhancing the quality of catalog data in the fast-growing e-commerce sector. Your responsibilities will include analyzing data to identify gaps, recommending solutions, and collaborating with cross-functional teams to drive operational decisions. Effective communication with stakeholders, building SOPs, template management, and ensuring adherence to quality processes will be key components of your role. You will proactively address item-related issues reported by merchants and suppliers, independently handle complex problems, and work towards eliminating process redundancies. Your proficiency in Microsoft Office applications, strong analytical skills, and ability to bring operational efficiencies by following best practices will be essential for success in this role.

The ideal candidate will hold a bachelor's degree with 0-3 years of experience in the retail/e-commerce industry, possess excellent English communication skills, and be adept at email etiquette. Flexibility to work in multiple shifts, along with technical skills such as system administration concepts, familiarity with ticketing systems, and basic scripting knowledge, will be advantageous. Experience with cloud platforms and data querying tools will also be beneficial.

At Walmart Global Tech, you will be part of a dynamic team that leverages technology to make a significant impact on the retail industry. You will have the opportunity to grow your skills, collaborate with experts, and drive innovation at scale. The hybrid work model at Walmart allows for a mix of in-office and virtual presence, providing flexibility and enabling quick decision-making.

Apart from a competitive compensation package, you will have access to incentive awards, best-in-class benefits, and a supportive work culture that values diversity and inclusion. By fostering a workplace where everyone feels included, Walmart aims to create opportunities for associates, customers, and suppliers globally. Join us at Walmart Global Tech to be a part of a team that is shaping the future of retail and making a positive impact on millions of lives.

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Gurugram

Work from Office

Job Description:
- The candidate should have extensive production experience (2+ years) in GCP.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.
Roles & Responsibilities:
- 4-10 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark, including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customer workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer for GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in the required GCP technical certifications.
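
As a small illustration of working with one of the GCP managed services listed above (BigQuery), the sketch below runs a query with the official Python client; the project, dataset, and table names are hypothetical.

```python
# A minimal BigQuery query sketch using the google-cloud-bigquery client;
# project, dataset, and table are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-analytics-project.analytics.raw_events`
    GROUP BY event_date
    ORDER BY event_date DESC
    LIMIT 7
"""
for row in client.query(sql).result():
    print(row["event_date"], row["events"])
```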

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Vadodara

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.
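
The imbalanced-dataset handling mentioned above can be pictured with a minimal class-weighting sketch in scikit-learn; the dataset here is synthetic and the model choice is only an example of the technique.

```python
# A minimal sketch of training on an imbalanced dataset using class weighting.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic binary problem with a 95/5 class imbalance
X, y = make_classification(
    n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" up-weights the minority class during training
clf = LogisticRegression(max_iter=1000, class_weight="balanced").fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```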

Posted 1 week ago

Apply

4.0 - 5.0 years

2 - 6 Lacs

Vadodara

Work from Office

Key Responsibilities : - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Requirements : - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Qualifications : - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Patna

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Nagpur

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

2 - 6 Lacs

Patna

Work from Office

Key Responsibilities : - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Requirements : - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Qualifications : - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Pimpri-Chinchwad

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

2 - 6 Lacs

Pimpri-Chinchwad

Work from Office

Key Responsibilities : - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Requirements : - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Qualifications : - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Thane

Work from Office

Responsibilities - Conduct feature engineering, data analysis, and data exploration to extract valuable insights. - Develop and optimize Machine Learning models to achieve high accuracy and performance. - Design and implement Deep Learning models, including Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Reinforcement Learning techniques. - Handle real-time imbalanced datasets and apply appropriate techniques to improve model fairness and robustness. - Deploy models in production environments and ensure continuous monitoring, improvement, and updates based on feedback. - Collaborate with cross-functional teams to align ML solutions with business goals. - Utilize fundamental statistical knowledge and mathematical principles to ensure the reliability of models. - Bring in the latest advancements in ML and AI to drive innovation. Required Skills - 4-5 years of hands-on experience in Machine Learning and Deep Learning. - Strong expertise in feature engineering, data exploration, and data preprocessing. - Experience with imbalanced datasets and techniques to improve model generalization. - Proficiency in Python, TensorFlow, Scikit-learn, and other ML frameworks. - Strong mathematical and statistical knowledge with problem-solving skills. - Ability to optimize models for high accuracy and performance in real-world scenarios. Preferred Skills - Experience with Big Data technologies (Hadoop, Spark, etc.) - Familiarity with containerization and orchestration tools (Docker, Kubernetes). - Experience in automating ML pipelines with MLOps practices. - Experience in model deployment using cloud platforms (AWS, GCP, Azure) or MLOps tools.

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design and optimize batch/streaming data pipelines using Scala, Spark, and Kafka
- Implement real-time tokenization/cleansing microservices in Java
- Manage production workflows via Apache Airflow (batch scheduling)
- Conduct root-cause analysis of data incidents using Spark/Dynatrace logs
- Monitor EMR clusters and optimize performance via YARN/Dynatrace metrics
- Ensure data security through HashiCorp Vault (Transform Secrets Engine)
- Validate data integrity and configure alerting systems
Requirements
Technical Requirements:
- Programming: Scala (Spark batch/streaming), Java (real-time microservices)
- Big Data Systems: Apache Spark, EMR, HDFS, YARN resource management
- Cloud & Storage: Amazon S3, EKS
- Security: HashiCorp Vault, tokenization vs. encryption (FPE)
- Orchestration: Apache Airflow (batch scheduling)
Operational Excellence:
- Spark log analysis, Dynatrace monitoring, incident handling, data validation
Mandatory Competencies:
- Expertise in distributed data processing (Spark on EMR/Hadoop)
- Proficiency in shell scripting and YARN job management
- Ability to implement format-preserving encryption (tokenization solutions)
- Experience with production troubleshooting (executor logs, metrics, RCA)
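
For the Apache Airflow batch scheduling mentioned above, a minimal DAG sketch (assuming Airflow 2.x) might look like the following; the DAG id, tasks, and shell commands are hypothetical placeholders standing in for real extract, Spark cleansing, and validation steps.

```python
# A minimal Airflow 2.x DAG sketch; task commands are placeholders, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_tokenization_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_from_s3",
        bash_command="echo 'pull raw files from S3'",              # placeholder command
    )
    spark_job = BashOperator(
        task_id="run_spark_cleansing",
        bash_command="echo 'spark-submit cleansing job on EMR'",   # placeholder command
    )
    validate = BashOperator(
        task_id="validate_and_alert",
        bash_command="echo 'row-count checks and alerting'",       # placeholder command
    )

    extract >> spark_job >> validate
```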

Posted 1 week ago

Apply

3.0 - 8.0 years

50 - 55 Lacs

Hyderabad

Work from Office

Are you interested in building high-performance, scalable financial systems that support Amazon's current and future growth? Are you looking for ways to invent newer and simpler ways of building solutions? If so, we are looking for you to fill a challenging position on the Amazon Finance Technology team. The Amazon Finance Technology team is looking for a Software Development Engineer who can help us create the next generation of distributed, scalable financial systems. Our ideal candidate thrives in a fast-paced environment and enjoys the challenge of highly complex business contexts that are typically being defined in real time. We need someone to design and develop services that facilitate global financial transactions worth billions of dollars annually.

As a Software Development Engineer, you will help solve a variety of technical challenges and build highly scalable solutions to solve unique problems for worldwide accounting and finance teams. You will work on big data problems making use of AWS services, design enterprise-scale systems, and develop and deploy highly scalable and reliable distributed services. You will tackle challenging, novel situations every day. Along the way, we guarantee that you will learn a ton, have fun, and make a huge impact.

- 3+ years of non-internship professional software development experience
- 2+ years of non-internship design or architecture experience (design patterns, reliability, and scaling) with new and existing systems
- Bachelor's degree or equivalent
- 3+ years of programming experience with at least one software programming language
- 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies