
2384 Hive Jobs - Page 37

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Source: LinkedIn

Job Description

Role and Responsibilities:
Emphasis is on end-to-end delivery of analysis.
Extremely comfortable working with data, including managing a large number of data sources, analyzing data quality, and proactively working with the client's data/IT teams to resolve issues.
Hands-on experience with machine learning algorithms such as Logistic Regression, Random Forest, and XGBoost.
Use a variety of analytical tools (Python, SQL, PySpark, etc.) to carry out analysis and draw conclusions.
Reformulate highly technical information into concise, understandable terms for presentations.

Candidate Profile
Required skills: Python, SQL, Hive, PySpark, Hadoop, Machine Learning, Credit Risk Modeling.
Experience with debt recovery and collections models would be a plus.
2-5 years of consulting/analytics delivery experience.
Experience in the Banking and Financial Services domain.
Master's or Bachelor's degree in math, statistics, economics, computer engineering, or a related analytics field.
Very strong analytical skills with the demonstrated ability to research and make decisions on day-to-day and complex customer problems.
Strong record of achievement, solid analytical ability, and an entrepreneurial, hands-on approach to work.
Outstanding written and verbal communication skills.

Job Location: 2 days work from office, 3 days work from home.
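
To illustrate the credit-risk modeling stack this posting names (Python with Logistic Regression/Random Forest/XGBoost), here is a minimal scikit-learn sketch. The CSV path, feature columns, and target name are hypothetical placeholders, not taken from the posting.

```python
# Illustrative sketch: training a logistic-regression credit-risk model.
# File name, feature columns, and target column are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("loans.csv")                       # hypothetical input file
X = df[["utilization", "dpd_30_count", "income"]]   # hypothetical features
y = df["defaulted"]                                 # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# AUC is a common discrimination metric for credit-risk scorecards.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"test AUC: {auc:.3f}")
```

The same train/evaluate shape carries over to RandomForestClassifier or XGBoost by swapping the estimator.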

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Source: Naukri

Hello Connections, Exciting Opportunity Alert!!
We're on the hunt for passionate individuals to join our dynamic team as Data Engineers.
Job Profile: Data Engineer
Experience: Minimum 6 to maximum 9 years
Location: Chennai / Hyderabad / Bangalore / Gurgaon / Pune
Mandatory Skills: Big Data | Hadoop | PySpark | Spark | Spark SQL | Hive
Qualification: B.Tech / B.E / MCA / Computer Science background (any specialization)
How to Apply? Send your CV to: sipriyar@sightspectrum.in
Contact Number: 6383476138
Don't miss out on this amazing opportunity to accelerate your professional career!
#bigdata #dataengineer #hadoop #spark #python #hive #pyspark

Posted 1 week ago

Apply

4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Requirements:
4+ years of experience as a Data Engineer or in a similar role.
Proficiency in Python, PySpark, and advanced SQL.
Hands-on experience with big data tools and frameworks (e.g., Spark, Hive).
Experience with cloud data platforms like AWS, Azure, or GCP is a plus.
Solid understanding of data modeling, warehousing, and ETL processes.
Strong problem-solving and analytical skills.
Good communication and teamwork abilities.

Responsibilities:
Design, build, and maintain data pipelines that collect, process, and store data from various sources.
Integrate data from multiple heterogeneous sources such as databases (SQL/NoSQL), APIs, cloud storage, and flat files.
Optimize data processing tasks to improve execution efficiency, reduce costs, and minimize processing times, especially when working with large-scale datasets in Spark.
Design and implement data warehousing solutions that centralize data from multiple sources for analysis.
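
As a sketch of the multi-source pipeline work described above, here is a minimal PySpark job that joins a JDBC table with a flat file and writes a curated, partitioned dataset. Connection details, paths, and column names are hypothetical.

```python
# Minimal PySpark pipeline sketch: JDBC table + CSV -> curated Parquet.
# All endpoints, credentials, and columns below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

orders = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://db-host:5432/shop")  # hypothetical
          .option("dbtable", "orders")
          .option("user", "etl").option("password", "***")
          .load())

customers = spark.read.option("header", True).csv("s3://bucket/customers.csv")

curated = (orders.join(customers, "customer_id")
           .withColumn("order_date", F.to_date("order_ts"))
           .groupBy("customer_id", "order_date")
           .agg(F.sum("amount").alias("daily_spend")))

# Partitioned Parquet keeps downstream scans cheap on large datasets.
(curated.write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://bucket/curated/daily_spend"))
```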

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Senior Data Engineer
Number of Open Roles: 1
Location: Noida
Experience: 5+ years

About the Company & Role:
We are one of India's foremost political consulting firms, leveraging the power of data to drive impactful, 360-degree election campaigns. Our unique approach brings together ground intelligence, data engineering, and strategic insight to shape electoral narratives and legislative landscapes. As a Senior Data Engineer, you will play a key leadership role in building robust, scalable, and high-performance data architectures that power our analytics and campaign strategies. This is an opportunity to drive large-scale data initiatives, mentor junior engineers, and work closely with cross-functional teams to build systems that influence real-world democratic outcomes.

What You'll Do:
● Architect, design, and manage scalable data pipelines for structured and unstructured data.
● Build and maintain data lakes, data warehouses, and ETL frameworks across cloud and on-prem platforms.
● Lead the modernization of our data infrastructure and migrate legacy systems to scalable cloud-native solutions.
● Collaborate with analysts, developers, and campaign strategists to ensure reliable data availability and quality.
● Drive implementation of best practices for data governance, access control, observability, and documentation.
● Guide and mentor junior data engineers and help foster a culture of excellence and innovation.
● Evaluate and implement cutting-edge tools and technologies to improve system performance and efficiency.
● Own and ensure end-to-end data reliability, availability, and scalability.

Key Requirements:
● 5+ years of experience in Data Engineering with a proven track record of building production-grade data systems.
● Deep expertise in Python and SQL (advanced level).
● Strong experience with Big Data tools such as Airflow, Hadoop, Spark, Hive, Presto, etc.
● Hands-on experience with data lake and warehouse architectures (e.g., Delta Lake, Snowflake, BigQuery, Redshift).
● Proven experience with ETL/ELT design, data modeling (star/snowflake schema), and orchestration tools.
● Proficient in working with cloud platforms like AWS, GCP, or Azure.
● Solid understanding of CI/CD pipelines, Git workflows, and containerization (Docker).
● Excellent knowledge of data security, privacy regulations, and governance practices.
● Exposure to streaming data architectures and real-time processing.

Nice to Have:
● Knowledge of data cataloging tools and metadata management.
● Familiarity with BI tools like Tableau, Power BI, or Looker.
● Experience working within political, social sector, or campaign data environments.
● Prior team lead experience in an agile environment.

Soft Skills:
● Strong problem-solving and decision-making capabilities.
● Excellent communication and stakeholder management skills.
● Ability to work in a fast-paced, mission-driven environment.
● Ownership mindset and ability to manage multiple projects with minimal supervision.
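
The star-schema modeling this posting asks for can be sketched in a few lines of PySpark: derive a dimension with a surrogate key from raw events, then key the fact table against it. Source and target table names and columns are hypothetical, not from the posting.

```python
# Hedged sketch: building a star-schema fact/dimension pair with Spark.
# All table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star_schema").enableHiveSupport().getOrCreate()
events = spark.table("raw.voter_contact_events")  # hypothetical source table

# Dimension: one row per constituency, with a surrogate key.
dim_constituency = (events.select("constituency_name", "state")
                    .dropDuplicates()
                    .withColumn("constituency_key",
                                F.monotonically_increasing_id()))

# Fact: measures keyed by the dimension's surrogate key.
fact_contact = (events.join(dim_constituency, ["constituency_name", "state"])
                .select("constituency_key", "contact_ts", "outcome_score"))

dim_constituency.write.mode("overwrite").saveAsTable("warehouse.dim_constituency")
fact_contact.write.mode("overwrite").saveAsTable("warehouse.fact_contact")
```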

Posted 1 week ago

Apply

2.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Heaps is one of the fastest growing health tech start-ups in India today. With more than 5 million patient interactions to date, Heaps is revolutionising how insurers, doctors and patients experience health care. We are building an AI-driven, end-to-end care management platform which coordinates, tracks and monitors the health of the patient, empowering them with the right information to take the right decisions at the right time about their health. Heaps is backed by marquee investors, titans in the healthcare industry who have invested more than $7.4 million in Series A funding. Heaps is on a path to global scaling; we've already expanded into 5 countries across the globe.

Are you a data scientist with a knack for problem solving and crisp communication? The data science team at Heaps is chartered to drive analytics/machine learning programs across the organisation. Heaps is looking for experienced Data Scientists who will be at the forefront of developing products/solutions for insurance/hospital clients across the globe. You will be responsible for driving use cases and analytics assignments to help clients uncover actionable insights from data, leading to improved care for patients.

Responsibilities
Understand business problems and work on the statistical and analytical approach required for the solution.
Work on all steps of a data science project: data mining and exploratory data analysis (EDA), feature engineering, statistical modelling, integration with the AI/ML platform, visualisation, result inferencing and presentation to stakeholders.
Leverage machine learning techniques such as regression, classification, clustering, Bayesian algorithms, matrix factorization and graphical models to contribute to the execution of our vision for ML-based technology solutions.
Leverage advanced NLP algorithms/architectures to build custom models for entity extraction and concept recognition, relation extraction, summarization, textual classification and clustering, etc.
Collaborate with business stakeholders to effectively integrate and communicate analysis findings.

Qualifications
2-3 years of prior data science experience.
Data science programming languages and querying databases: Python, SQL/Hive.
Deep understanding of various statistical/ML techniques.
Expertise in the use of cloud-based infrastructure to manage the volume and veracity of complex data streams.
Basic knowledge of API frameworks like Flask/FastAPI, version control systems like Git, and cloud platforms like AWS/GCP.
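
The textual-classification work this posting mentions can be illustrated with a small scikit-learn pipeline (TF-IDF features feeding a logistic-regression classifier). The toy notes and labels below are made up for illustration only.

```python
# Illustrative text-classification sketch; training data is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

notes = ["patient reports chest pain", "routine follow-up, no issues",
         "severe headache and nausea", "annual wellness check"]
urgent = [1, 0, 1, 0]  # hypothetical urgency labels

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # word + bigram features
    ("model", LogisticRegression()),
])
clf.fit(notes, urgent)
print(clf.predict(["sudden chest pain at rest"]))  # model's urgency call
```

In practice the same pipeline object slots into cross-validation and grid search unchanged, which is why the Pipeline pattern is the idiomatic starting point.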

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Job Description

Responsibilities:
Implement data pipelines that meet the design and are efficient, scalable, and maintainable.
Implement best practices including proper use of source control, participation in code reviews, data validation and testing.
Deliver on time while working on projects.
Act as an advisor/mentor and help junior data engineers with their deliverables.

Must Have Skills:
At least 4+ years of experience with data engineering.
Strong experience designing, implementing and fine-tuning big data processing pipelines in production environments.
Experience with big data tools like Hadoop, Spark, Kafka, Hive, Databricks.
Programming experience in at least one of Python, Java, Scala, Shell Script.
Experience with relational SQL and NoSQL databases like PostgreSQL, MySQL, Cassandra, etc.
Experience with any data visualization tool (Plotly, Tableau, Power BI, Google Data Studio, QuickSight, etc.).

Good To Have Skills:
Basic knowledge of CI/CD pipelines.
Experience working on at least one cloud (AWS, Azure, or GCP).
For AWS: experience with AWS services like EC2, S3, EMR, RDS, Athena, Glue, Lambda.
For Azure: experience with Azure services like Azure Blob/Data Lake Gen2, Delta Lake, Databricks, Azure SQL, Azure DevOps, Azure Data Factory, Power BI.
For GCP: experience with GCP services like BigQuery, Cloud Storage buckets, Dataproc, Dataflow, Pub/Sub, Cloud Functions, Data Studio.
Sound familiarity with versioning tools (Git, SVN, etc.).
Experience mentoring students is desirable.
Knowledge of the latest developments in machine learning, deep learning and optimization in the automotive domain.
Open-minded approach to exploring multiple algorithms to design optimal solutions.
History of contributing articles/blogs/whitepapers in analytics.
History of contributions to open source.

Required Skills: Data Engineering, Hadoop, Kafka, CI/CD, Cloud

Posted 1 week ago

Apply

1.0 - 6.0 years

2 - 7 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Source: Naukri

Position Name: Data Engineer
Total Exp: 3-5 years
Notice Period: Immediate joiner
Work Location: Mumbai, Kandivali
Work Type: Work from Office

Job Description
Must have:
- Data Engineer with 3 to 5 years of experience.
- Should be an individual contributor, delivering features/stories within the given time and to the expected quality.
- Should be well versed in the Agile process.
- Should be strong in programming and SQL queries.
- Should be capable of learning new tools and technologies to scale in data engineering.
- Should have good communication and client-interaction skills.

Technical Skills:
- Must have: Data engineering using Java/Python, Spark/PySpark, Big Data (Hadoop, Hive, Yarn, Oozie, etc.), cloud warehouse (Snowflake), cloud services (AWS EMR, S3, Lambda, RDS/Aurora).
- Must have: Unit testing frameworks JUnit/Mockito/PowerMock.
- Must have: Strong experience with SQL queries (MySQL/SQL Server/Oracle/Hadoop/Snowflake).
- Must have: Source control - GitHub.
- Must have: Project management tool - VSTS.
- Must have: Build management tool - Maven/Gradle.
- Must have: CI/CD - Azure DevOps.

Added advantage:
- Good to have: Shell script, Linux commands.
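
A minimal sketch of the Spark-on-Hive work this stack implies: enable Hive support and run Spark SQL against a warehouse table, then land the result in S3. The schema, table, and bucket names are hypothetical.

```python
# Sketch: Spark SQL over a Hive table, output to Parquet on S3.
# Database, table, columns, and bucket are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive_report")
         .enableHiveSupport()   # lets Spark read managed Hive tables
         .getOrCreate())

daily = spark.sql("""
    SELECT txn_date, channel, SUM(amount) AS total_amount
    FROM finance.transactions              -- hypothetical Hive table
    WHERE txn_date >= date_sub(current_date(), 7)
    GROUP BY txn_date, channel
""")
daily.write.mode("overwrite").parquet("s3://reports/weekly_txn_summary")
```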

Posted 1 week ago

Apply

2.0 - 3.0 years

0 Lacs

Surat, Gujarat, India

On-site

Source: LinkedIn

Job Title: Mobile App Developer - Flutter
Location: Surat, India
Experience Level: 2-3 Years
Job Type: Full-Time

About Us:
Soulera provides AI-driven handwriting analysis and Chaldean numerology. We serve educational institutions (B2B) and individuals (B2C). Our platform, Soulera, integrates traditional sciences with AI to deliver precise, insightful analyses.

Position Overview:
We're seeking a Mobile App Developer with 2-3 years of experience to build and maintain high-quality mobile applications on both Android and iOS platforms using Flutter. The ideal candidate is comfortable with Dart and cross-platform design, and has experience working with mobile services like Firebase, OneSignal, and Sentry.

Responsibilities:
Develop, test, and maintain cross-platform mobile applications using Flutter and Dart.
Build intuitive, responsive UIs using responsive_framework.
Follow the MVC architecture pattern and manage app state and routing with GetX.
Integrate and manage GraphQL APIs using graphql_flutter and REST APIs using Dio and Http.
Implement secure local storage using Hive.
Handle JSON serialisation using native Dart/web JSON (no dependencies).
Apply custom form validation using regex for all user inputs.
Integrate services like Firebase (for crash reporting), Sentry, and OneSignal (for push notifications).
Work with custom Figma-based icon sets and implement theming using fonts like Figtree, RobotoMono, Cormorant Garamond, and Redacted Script.
Collaborate with backend, product, and design teams to ship pixel-perfect mobile features.
Debug, monitor, and improve mobile performance and crash logs proactively.

Qualifications:
2-3 years of experience building mobile apps with Flutter and Dart.
Strong knowledge of Android Studio and Xcode.
Experience deploying apps for Android and iOS.
Comfortable using CocoaPods and managing native dependencies.
Solid understanding of GetX for both state management and navigation.
Experience with secure storage, push notifications, crash analytics, and payment flows.
Familiarity with custom UI/UX implementations as per Figma designs.
Strong debugging skills and a performance-optimization mindset.
Proficient in using Firebase, Sentry, OneSignal, and Razorpay.

Bonus Skills:
Experience writing unit and widget tests in Flutter.
Prior experience deploying apps to the Play Store and App Store.
Understanding of accessibility and offline-first mobile app practices.
Experience with CI/CD for mobile app delivery.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Position: Lead Data Engineer
Experience: 7+ Years
Location: Remote

Summary
We are looking for a Lead Data Engineer responsible for ETL processes and documentation in building scalable data warehouses and analytics capabilities. This role involves maintaining existing systems, developing new features, and implementing performance improvements.

Key Responsibilities
Build ETL pipelines using Fivetran and dbt for internal and client projects across platforms like Azure, Salesforce, and AWS.
Monitor active production ETL jobs.
Create and maintain data lineage documentation to ensure complete system traceability.
Develop design/mapping documents for clear and testable development, QA, and UAT.
Evaluate and implement new data integration tools based on current and future requirements.
Identify and eliminate process redundancies to streamline data operations.
Work with the Data Quality Analyst to implement validation checks across ETL jobs.
Design and implement large-scale data warehouses, BI solutions, and Master Data Management (MDM) systems, including Data Lakes/Data Vaults.

Required Skills & Qualifications
Bachelor's degree in Computer Science, Software Engineering, Math, or a related field.
6+ years of experience in data engineering, business analytics, or software development.
5+ years of experience with strong SQL development skills.
Hands-on experience with Snowflake and Azure Data Factory (ADF).
Proficient in ETL toolsets such as Informatica, Talend, dbt, and ADF.
Experience with PHI/PII data and working in the healthcare domain is preferred.
Strong analytical and critical thinking skills.
Excellent written and verbal communication.
Ability to manage time and prioritize tasks effectively.
Familiarity with scripting and open-source platforms (e.g., Python, Java, Linux, Apache, Chef).
Experience with BI tools like Power BI, Tableau, or Cognos.
Exposure to Big Data technologies: Snowflake (Snowpark), Apache Spark, Hadoop, Hive, Sqoop, Pig, Flume, HBase, MapReduce.
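
The "validation checks across ETL jobs" responsibility can be sketched as a small post-load assertion function. The thresholds, column names, and staging path below are hypothetical illustrations, not part of the posting.

```python
# Hedged sketch: row-count and null-rate checks run after an ETL load.
# Thresholds, columns, and the staging file are hypothetical.
import pandas as pd

def validate_load(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the load passed."""
    failures = []
    if len(df) == 0:
        failures.append("load produced zero rows")
    null_rate = df["patient_id"].isna().mean()  # hypothetical key column
    if null_rate > 0.01:
        failures.append(f"patient_id null rate {null_rate:.1%} exceeds 1%")
    if df["admit_date"].max() > pd.Timestamp.today():
        failures.append("admit_date contains future dates")
    return failures

issues = validate_load(pd.read_parquet("staging/admissions.parquet"))
if issues:
    raise ValueError("; ".join(issues))
```

Raising on failure makes the check usable as a pipeline gate: the orchestrator marks the job failed instead of silently publishing bad data.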

Posted 1 week ago

Apply

5.0 years

0 Lacs

Vadodara, Gujarat, India

Remote

Source: LinkedIn

Job Title: Data Scientist
Location: Vadodara preferred, or Remote
Experience Level: Entry to Mid-level (0-5 years)
Employment Type: Full-Time

Job Summary:
We are seeking a skilled and driven Data Scientist with experience in Big Data, Large Language Models (LLMs), data organization, data analytics and advanced Excel. The ideal candidate will have a strong analytical background, a passion for uncovering insights from complex datasets and the ability to communicate data-driven findings to technical and non-technical stakeholders. You will be instrumental in designing and implementing data solutions that empower business decision-making and innovation.

Key Responsibilities:
· Collect, clean and organize large and complex datasets from multiple sources.
· Develop and deploy predictive models and machine learning algorithms, including applications involving LLMs (e.g., GPT, BERT).
· Analyse structured and unstructured data to generate actionable insights for business strategies.
· Collaborate with cross-functional teams to identify data-related opportunities and deliver data-driven solutions.
· Build scalable data pipelines and contribute to data architecture best practices.
· Design and maintain advanced Excel dashboards, models and reports to support various departments.
· Apply statistical and data mining techniques to interpret and visualize trends, patterns and correlations.
· Present findings clearly through reports, visualizations and presentations tailored to different audiences.
· Stay current on emerging data technologies, tools and methodologies.

Required Qualifications:
· Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics or a related field.
· 0 to 5 years of experience in a data science or analytics role.
· Advanced skills in Microsoft Excel, including Power Query, pivot tables, VBA/macros and complex formulas.
· Solid understanding of relational databases and SQL.
· Strong communication and problem-solving skills.

Preferred Qualifications:
· Hands-on experience with Big Data technologies (e.g., Hadoop, Spark, Hive).
· Experience with cloud platforms (e.g., AWS, Azure, GCP) for data processing and model deployment.
· Working knowledge or experience with Large Language Models (LLMs) and NLP techniques.
· Proficiency in Python, R, or other data science programming languages.
· Expertise in data analytics, data visualization (e.g., Tableau, Power BI, matplotlib) and data wrangling.
· Familiarity with MLOps and model lifecycle management.
· Knowledge of version control tools such as Git.
· Experience with APIs and integrating external data sources.
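
Since this role pairs advanced Excel pivot tables with Python, here is the pandas equivalent of an Excel pivot table, a deliberate Python stand-in for the Excel skill the posting names. The toy sales data is invented.

```python
# Illustrative sketch: pandas pivot_table as the analogue of an Excel
# pivot table. The DataFrame contents are made up.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [120, 135, 90, 110],
})

# Rows = region, columns = quarter, values = summed revenue, with totals.
pivot = pd.pivot_table(sales, values="revenue", index="region",
                       columns="quarter", aggfunc="sum", margins=True)
print(pivot)
```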

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Title: Sr. Data Engineer
Location: Pune (Hybrid - 3 Days WFO)
Shift: General shift (Asia/Europe overlap); occasional extended hours for audits/releases
Notice Period: Immediate to 30 days preferred
Interview Rounds: 2 rounds (max 3 if needed)

Core Technical Requirements
MSBI Stack (Mandatory): SSIS, SSAS, SSRS, Power BI; strong DAX and MDX skills.
Database Proficiency: Expert in SQL Server and Oracle (optimized query writing).
Big Data Tools (Nice-to-Have): Basic to intermediate knowledge of Hadoop, Spark, Hive, Impala; willingness to learn and adapt.
ETL & Data Pipelines: Building and deploying scalable data and analytics solutions; handling petabytes of data.
ITSM & Agile: Familiarity with BMC Remedy (incident/change/problem management); understanding of Agile ceremonies (Scrum, backlog grooming, story points).

Additional Skills & Preferences
SFTP: Understanding of secure file transfer protocols.
Support: L1/L2 support expected (approx. 50% of workload).
Cloud: No current cloud usage; future potential for Azure/AWS.
Domain: Payments domain is a plus, not mandatory.
Certifications: Not required, but Power BI certification is a bonus.

Soft Skills
Strong communication (written and verbal).
Adaptability to handle ad hoc requests and support.
Team collaboration and understanding of Agile workflows.

Value Proposition
Exposure to petabyte-scale data and on-prem big data platforms.
Opportunity to work with global clients and regulatory bodies.
Long-term potential to transition into cloud-based data engineering.
Hands-on experience with end-to-end data product development.

If you are interested, please send your availability, current and expected CTC, and your latest resume to jeff@priglobal.com.

Thanks,
Jeff Mislang
Delivery Lead, PRI India IT Services Pvt Ltd
20 LIG, Dharma Reddy Colony Phase I, Kukatpally, Hyderabad, Telangana 500085
Email: jeff@priglobal.com

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3,000 global team members. We also have offices in Canada and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Experience: 3-8 years
Location: Gurgaon & Bangalore

Job Description
You should have extensive production experience in GCP; other cloud experience would be a strong bonus.
Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
Exposure to enterprise application development is a must.

Roles & Responsibilities
Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions.
Strong experience in Big Data technologies (Hadoop, Sqoop, Hive and Spark) including DevOps.
Good hands-on expertise in either Python or Java programming.
Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, Cloud IAM.
Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, Anthos.
Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical road-maps for GCP cloud implementations.
Experience with technical solutions based on industry standards using GCP IaaS, PaaS and SaaS capabilities.
Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
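
As a sketch of the BigQuery usage named above, here is a minimal query through the official google-cloud-bigquery client. The project, dataset, and table identifiers are hypothetical.

```python
# Minimal BigQuery sketch using the official Python client.
# Project id, dataset, and table are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-project.analytics.raw_events`      -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""
# result() blocks until the job finishes, then yields rows.
for row in client.query(query).result():
    print(row.event_date, row.events)
```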

Posted 1 week ago

Apply

3.5 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3,000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Job Role: Data Scientist
Experience: 3.5+ years
Locations: Indore, Bangalore, Pune, Gurgaon, Noida

Job Description
Hands-on experience working with LLMs (Large Language Models), LangChain, and OpenAI in particular.
Implementing and fine-tuning AI-generated text prompts using LLMs (e.g., GPT-4).
Skilled in AI-specific utilities like ChatGPT, Hugging Face Transformers, etc.
Ability to understand business requirements.
Use-case derivation and solution creation from structured/unstructured data.
Storytelling, business communication and documentation.
Programming skills: Python, scikit-learn, TensorFlow, PyTorch, Keras.
Exploratory data analysis.
Machine learning and deep learning algorithms.
Model building, hyperparameter tuning and model performance metrics.
MLOps, data pipelines, data engineering.
Statistics knowledge (probability distributions, hypothesis testing).
Time series modeling, forecasting, image/video analytics, natural language processing (NLP).
ML services from clouds such as AWS, GCP, Azure and Databricks.
Optional: Big Data, with basic knowledge of Spark and Hive.

Roles & Responsibilities
Acquire the skills required to build machine learning models and deploy them to production.
Feature engineering, EDA, pipeline creation, model training and hyperparameter tuning with structured and unstructured datasets.
Develop cloud-based applications, including LLM/GenAI applications, and deploy them into production.

Qualification
Degree: graduate/postgraduate in CSE/IT or a related field.
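
The Hugging Face Transformers skill this posting lists can be illustrated with a zero-shot classifier that routes free-text requests to use cases. The candidate labels here are hypothetical examples.

```python
# Illustrative Hugging Face Transformers sketch: zero-shot classification.
# The candidate labels are hypothetical.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

result = classifier(
    "Forecast next quarter's demand from the last three years of sales",
    candidate_labels=["time series forecasting", "image analytics", "NLP"],
)
print(result["labels"][0])  # highest-scoring use case
```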

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Nielsen, a global company specializing in audience measurement and analytics, is currently seeking a proficient leader in data engineering to join their team in Bangalore or Gurgaon. In this role, you will be managing a scrum team and overseeing an advanced data platform that analyzes audience consumption patterns across channels like OTT, TV, Radio, and Social Media worldwide. You will be responsible for building and supervising a top-performing data engineering team that delivers data for targeted campaigns. Moreover, you will work with AWS services (S3, Lambda, Kinesis) and other data engineering technologies such as Spark, Scala/Python, Kafka, etc. There may also be opportunities to establish deep integrations with OTT platforms like Netflix, Prime Video, and others. Furthermore, you will be accountable for ensuring engineering and operational excellence, as well as fostering a culture of innovation and experimentation within the team. Developing leaders within the team is another crucial aspect of this role.

About the role
This position is responsible for participating as a team lead/developer in analyzing and designing highly complex or business-critical applications, as well as developing, testing, and supporting application software.

Responsibilities
Oversee the development of scalable, reliable, and cost-effective software solutions with an emphasis on quality, best-practice coding standards, and cost-effectiveness.
Participate as a team lead on projects, which includes training, coaching, and sharing technical knowledge with less experienced staff.
Rapidly identify and resolve technical incidents as they emerge.
Build rapid technical prototypes for early customer validation of new technologies.
Collaborate effectively with Data Science to understand, translate, and integrate methodologies into engineering build pipelines.
Collaborate with product owners to translate complex business requirements into technical solutions, providing leadership in the design and architecture processes.
Provide expert apprenticeship to project teams on technology strategy, cultivating advanced skill sets in application engineering and implementing modern software engineering practices.
Lead and mentor a team of Software Developers and Senior Software Developers, providing guidance and support in their professional development.
Stay informed about the latest technology and methodology by participating in industry forums, maintaining an active peer network, and engaging actively with customers.
Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through collaborative effort.

Key Skills

Domain Expertise
Bachelor's degree in computer science or engineering, plus 6-8 years of experience in information technology solutions development and 2-3 years managing teams.
Proven experience in leading and managing software development teams.
Must have strong cloud implementation expertise in cloud architecture.
Must have the ability to provide solutions utilizing best practices for resilience, scalability, cloud optimization and security.
Basic project management skills.

Technical Skills
6+ years of experience in big data using Apache Spark to develop distributed processing applications, and in building applications with immutable infrastructure in the AWS Cloud with automation technologies like Terraform, Ansible, or CloudFormation.
Experience in service-oriented architecture, Spark Streaming, and Git.
Experience in software development using programming languages and tools/services such as Java or Scala, Big Data, Hadoop, Spark, Spark SQL, Presto/Hive, cloud (preferably AWS), Docker, RDBMS (such as Postgres and/or Oracle), Linux, shell scripting, GitLab, Airflow.
Experience in big data processing tools/languages using Apache Spark with Scala.
Experience with orchestration tools: Apache Airflow or similar.
Strong knowledge of Unix/Linux OS, commands, shell scripting, Python, JSON, YAML.
Agile scrum experience in application development is required.
Strong knowledge of AWS S3 and PostgreSQL or MySQL.
Strong knowledge of AWS compute: EC2, EMR, AWS Lambda.
Strong knowledge of GitLab/Bitbucket.
AWS certification is a plus.

Mindset and attributes
Exceptional verbal/written communication and interpersonal skills.
Strong leadership qualities and the ability to inspire and motivate a team.
Strong ability to translate business requirements into technical solutions and guide the team in execution.
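
The Spark Streaming plus Kafka stack this role centers on can be sketched with Structured Streaming: read a Kafka topic and append the events to Parquet with checkpointing. Broker addresses, the topic, and paths are hypothetical.

```python
# Sketch: Spark Structured Streaming from Kafka to Parquet.
# Brokers, topic, and storage paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("viewership_stream").getOrCreate()

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical
          .option("subscribe", "viewership-events")
          .load())

# Kafka values arrive as bytes; cast to string before parsing downstream.
events = stream.select(F.col("value").cast("string").alias("payload"))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3://measurement/landing/")
         .option("checkpointLocation", "s3://measurement/checkpoints/")
         .start())
query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once file output across restarts, so it belongs on durable storage, not local disk.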

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Position Summary
Demonstrates up-to-date expertise and applies this to the development, execution, and improvement of action plans by providing expert advice and guidance to others in the application of information and best practices; supporting and aligning efforts to meet customer and business needs; and building commitment for perspectives and rationales. Provides and supports the implementation of business solutions by building relationships and partnerships with key stakeholders; identifying business needs; determining and carrying out necessary processes and practices; monitoring progress and results; recognizing and capitalizing on improvement opportunities; and adapting to competing demands, organizational changes, and new responsibilities. Models compliance with company policies and procedures and supports company mission, values, and standards of ethics and integrity by incorporating these into the development and implementation of business plans; using the Open Door Policy; and demonstrating and assisting others with how to apply these in executing business processes and practices.

What you'll do...

About The Team
Our team is responsible for the design, development, and operations of the Walmart Fulfillment System (WFS), designed to attract new marketplace sellers to increase assortment, speed up delivery, and monetize our omnichannel supply chain assets. This position is focused on bringing operational excellence and finding opportunities for improvement in the current WFS ecosystem by building new tools and capabilities, as well as looking for automation opportunities. This also includes ensuring that every feature release of WFS has adequate alerts, monitoring, and observability dashboards set up.

What You'll Do:
Through this role you have an opportunity to develop software and work on automation that meets and exceeds the needs of the sellers, internal teams, and the company.
Contribute to and review technical designs of software solutions.
Contribute to and follow development best practices such as version control, unit testing, continuous integration, performance and security testing, and appropriate documentation.
Show your skills in analyzing and testing programs/products before formal launch to ensure flawless performance.
Software security is of prime importance; by developing programs that monitor sharing of private information, you will add tremendous credibility to your work.
Collaborate with developers to implement solutions, resolve problems, and perform code reviews.
Collaborate with team members to develop best practices and client requirements for the software.
Work with stakeholders following the Agile Scrum development process.
Seek ways to improve the software and its effectiveness.
Support the coaching and training of other team members to ensure all employees are confident in the use of software applications.

What You'll Bring:
Bachelor's degree in Computer Science or a related technical field.
6 to 10 years of experience developing enterprise applications using Java, Spring Boot, REST, Kafka.
2+ years of experience in distributed systems and large-scale application development and design.
Extensive hands-on experience building microservices using Java.
Working knowledge of SQL/NoSQL database technologies such as Oracle, Cassandra, Hive, and caching solutions (Couchbase, Memcached).
Exposure to cloud (AWS/Azure) and Kubernetes.
Experience with continuous integration and related tools like Jenkins, Maven.
Experience with messaging systems like MQ, Kafka, or Azure Event Hub.
Experience in production system operations (logging, telemetry, alerting, etc.).
Experience working with Grafana, Prometheus, Splunk, xMatters, or similar tools.

Additional Qualifications:
Large-scale distributed systems experience, including scalability and fault tolerance.
Scripting knowledge for jobs executed daily to refresh data feeds from multiple systems.
A continuous drive to explore, improve, enhance, automate, and optimize systems and tools.
Strong computer science fundamentals in data structures and algorithms.
Must be able to work effectively in a team setting as well as individually.
Ability to communicate and collaborate with external teams and stakeholders.
Excellent oral and written communication skills.

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunities Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.

Minimum Qualifications
Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 3 years' experience in software engineering or a related area at a technology, retail, or data-driven company.
Option 2: 5 years' experience in software engineering or a related area at a technology, retail, or data-driven company.

Preferred Qualifications
Certification in Security+, GISF, CISSP, CCSP, or GSEC; Master's degree in computer science, information technology, engineering, information systems, cybersecurity, or a related area, and 1 year's experience leading information security or cybersecurity projects; Information Technology - CISCO Certification.

Primary Location
4, 5, 6, 7 Floor, Building 10, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India
R-2147338

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Good day,

We have an immediate opportunity for a GCP Data Engineer role.

Job Role: GCP Data Engineer
Job Location: Gurugram
Experience: 5 to 9 years
Notice Period: Immediate joiners are preferred

About Company:
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 24+ years, our company has been honoured with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 13,900+ and 52 offices in 22 countries within key global markets. For more information on the company, please visit our website or LinkedIn community.

Diversity, Equity, and Inclusion
Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative-action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture, promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Job Description: GCP Data Engineer
We are on the lookout for a GCP Data Engineer with strong hands-on experience using GCP, Python, Spark, PySpark, and Hive. If you are a seasoned professional with a passion for crafting high-quality data engineering work and a knack for configuration, we want to hear from you.

Overall Responsibilities:
5+ years of hands-on experience in data engineering.
Fundamental understanding of, and development experience in, Hive, Spark, and Python.
Familiarity with GCP offerings and experience building data pipelines on GCP.
Good communication skills; proactive.

Skills:
Data engineering on GCP.
Excellent communication and interpersonal skills.
Ability to work independently and as part of a team.
Ability to take initiative, prioritize tasks, and manage time effectively.

Experience:
Minimum of 5+ years of experience in software development, with a focus on data technologies.
Experience with software development methodologies and tools such as Agile, Scrum, Git, JIRA, and Confluence.
Experience working with cross-functional teams and participating in code reviews.

If you find this opportunity interesting, kindly share your updated profile at Muhammed.shan@synechron.com with the details below (mandatory):
Total Experience:
Current CTC:
Expected CTC:
Notice period:
Current Location:
Have you interviewed at Synechron before? If yes, when?

Regards,
Muhammed Shan
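
As a compact sketch combining the stack in this posting (PySpark on GCP with Hive), here is a job that reads CSV from a GCS bucket and saves a filtered Hive table. The bucket, filter condition, and table names are hypothetical.

```python
# Hedged sketch: PySpark on GCP, GCS CSV -> Hive table.
# Bucket path, column, and table names are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName("gcs_to_hive")
         .enableHiveSupport().getOrCreate())

trades = (spark.read.option("header", True)
          .csv("gs://raw-zone/trades/*.csv"))     # hypothetical bucket

(trades.filter("status = 'SETTLED'")
 .write.mode("overwrite")
 .saveAsTable("curated.settled_trades"))          # hypothetical Hive table
```

On Dataproc the gs:// connector is available out of the box, which is why the read path needs no extra configuration in this sketch.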

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description

Job Summary
This role could be based in India or Malaysia. When you start the application process you will be presented with a drop-down menu showing all countries; please ensure that you select a country where the role is based.

Responsibilities
Lead the implementation and advocacy of SRE (Site Reliability Engineering) principles to improve the reliability and availability of our applications.
Drive work on setting and maintaining SLIs/SLOs/error budgets for our applications.
Responsible for developing and executing on the Chapter Vision together with the other Chapter Leads.
Drive technology strategy, technology stack selection, and implementation for a future-ready technology stack, to achieve highly scalable, robust, resilient systems.
Experienced former practitioner with leadership ability; oversees the execution of functional standards and best practices.
Provide thought leadership on the craft; inspire and retain talent by developing and nurturing an extensive internal and external network of practitioners.
This role is about capability building; it is not to own applications or delivery.
Creates a strategy roadmap of technical work.
Works to drive technology convergence and simplification across their chapter area.

Technical Responsibilities
Service Reliability: Monitor and maintain the reliability, availability, and performance of production services and infrastructure.
Automation and Tooling: Develop and maintain automation tools and processes to streamline system provisioning, configuration management, deployment, and monitoring.
Incident Management: Respond to and troubleshoot incidents, outages, and performance issues in production environments, ensuring timely resolution and minimal impact on users.
Blameless Postmortems and Learning from Incidents: Participate in the wider root cause analysis and support and drive collaborative actions.
Capacity Planning: Analyze system performance and capacity trends to forecast future resource requirements and optimize infrastructure utilization.
Performance Optimization: Identify and address performance bottlenecks and optimization opportunities across the software stack, from application code to underlying infrastructure.
Security and Compliance: Implement security best practices and ensure compliance with regulatory requirements, collaborating with security and compliance teams as needed.
Continuous Improvement: Continuously evaluate and improve system reliability, scalability, and performance through automation, process refinement, and technology upgrades.
Documentation and Knowledge Sharing: Document system designs, configurations, and procedures, and share knowledge with team members through documentation, training, and mentoring.

Strategy
Reliability Engineering Strategy: Develop and execute a comprehensive reliability engineering strategy to ensure high availability, fault tolerance and disaster recovery capabilities for critical systems and services.
Scalability Planning: Design and implement scalable architecture solutions that can accommodate growth in user traffic and data volume over time.
Monitoring and Alerting Strategy: Define and implement monitoring and alerting strategies to proactively identify and address issues before they reach end users.
Capacity Planning Strategies: Develop capacity planning strategies to ensure that systems have sufficient resources to handle current and future workloads.

Business
Experienced practitioner and hands-on contributor to the squad delivery for their craft (e.g., SRE).
Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner and Hive Leadership, and in alignment with the fixed capacity model.
Responsible for evolving the craft towards improving automation, simplification and innovative use of the latest market trends.
Trusted advisor to the business: work hand in hand with the business, taking product programs from investment decisions into design, specification, and solution phases, all the way to operations on the ground, and securing support services from other teams.
Provide leadership and technical expertise for the subdomain to achieve goals and outcomes.
Support respective businesses in the commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, and collecting defects for future improvements.
Manage business partner expectations; ensure delivery to the business meets time, cost and quality expectations.

Processes
The Chapter Lead's processes may vary based upon the specific chapter domain they are leading.
Define standards to ensure that applications are designed with scale, resilience and performance in mind.
Enforce and streamline sound development practices, and establish and maintain effective governance processes, including training, advice and support, to assure the platforms are developed, implemented and maintained in line with the Group's standards.
Responsible for overall governance of the subdomain, including financial management, risk management, representation in steering committee reviews, and engagement with the business on strategy, change management and timely course correction as required.
Ensure compliance with the highest standards of business conduct, regulatory requirements and practices defined by internal and external requirements, including local banking laws and anti-money-laundering stipulations.

People & Talent
Accountable for people management and capability development of their Chapter members.
Reviews metrics on capabilities and performance across their area, keeps an improvement backlog for their Chapters, and drives continual improvement of their chapter.
Focuses on the development of people and capabilities as the highest priority.
Ensure the organisation works proactively to upgrade capacity well in advance and predict future capacity needs.
Responsible for building an engineering culture where application and infrastructure scalability is paramount for ongoing capacity management, with an aim to reduce the need for capacity reviews through monitoring and auto-scale properties.
Empower engineers so that they can provide economies of scale focused on delivering value, speed to market, availability, and monitoring and system management.
Foster a culture of innovation, transparency, and accountability end to end in the subdomain, while promoting a "business-first" mentality at all levels.
Develop and maintain a plan that provides for succession and continuity in the most critical delivery and management positions.

Risk Management
Responsible for effective capacity risk management across the Chapter with regard to attrition and leave plans.
Ensures the chapter follows the standards with respect to risk management as applicable to their chapter domain, and adheres to common practices to mitigate risk in their respective domain.
Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct and compliance matters.
Incident Response Planning: Develop incident response plans and procedures to effectively mitigate and manage risks when they materialize.
Risk Monitoring and Alerting: Implement monitoring and alerting systems to detect early warning signs of potential risks.
Root Cause Analysis: Conduct thorough root cause analysis of incidents and outages to understand the underlying causes and contributing factors.

Regulatory & Governance
Ensure all artefacts and assurance deliverables meet the required standards and policies (e.g., SCB Governance Standards, ESDLC, etc.).
Display exemplary conduct and live by the Group's Values and Code of Conduct.
Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.

Key Stakeholders
Chief Product Owner, Hive Lead, Product Owners, Engineering Leads, WRB Application Teams.

Other Responsibilities
Embed Here for Good and the Group's brand and values in the digital sales/commerce team.
Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures.

Qualification Requirements & Skills
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Overall experience of 15+ years.
Proven experience of at least 10+ years as an SRE or in a similar role, with a proven track record of leadership.
Strong understanding of SRE principles and practices.
Proficiency in troubleshooting complex issues and exceptional problem-solving skills.
Deep knowledge of a wide array of software applications and infrastructure.
Experience with monitoring and observability tools (e.g., Prometheus, Grafana, AppDynamics, Splunk, PagerDuty).
Proficiency in scripting and automation (e.g., Python, Bash, Ansible).
Familiarity with cloud platforms (e.g., AWS, Azure) and containerization technologies (e.g., Docker, Kubernetes).
Excellent communication and collaboration skills.
Ability to work in a fast-paced, dynamic environment.
Strong attention to detail and a commitment to delivering high-quality results.
Ability to debug and troubleshoot Java applications.
Proficiency in using Splunk for log management and analysis.
Familiarity with CI/CD tools and practices.
Experience in the banking or financial services industry.
Certification in relevant technologies (e.g., AWS Certified Solutions Architect, Google Cloud Professional DevOps Engineer).
Knowledge of security best practices and compliance requirements.
Ability to articulate the overall vision for the Chapters and ensure upskilling of the organisation holistically.
Experience identifying skill gaps and mitigating risks to deliverables.
Ensure all solutions are per Architecture Standards.
Strong experience in software development, system administration, or a related technical field.
Proficiency in programming/scripting languages such as Python, Go, Java, or shell scripting.
Experience with containerization and orchestration technologies such as Docker, Kubernetes, or similar.
Deep understanding of Linux/Unix systems and networking fundamentals.
Experience with cloud platforms such as AWS, GCP, or Azure.
Strong analytical and problem-solving skills, with keen attention to detail.
Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
Prior experience with DevOps practices, continuous integration/continuous delivery (CI/CD) pipelines, and infrastructure as code (IaC) is a plus.

Role Specific Technical Competencies
Software Engineering; Systems Software; Infrastructure Platform Architecture; Programming & Scripting (Java/Python or a similar programming language); Cloud (AWS, Azure, GCP); Database Development; Service Excellence; Agile Application Delivery Process; Operating Systems; Network Fundamentals; Security Fundamentals; Credit Card and Lending Domain Knowledge.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us.

Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
Are better together; we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum.
Flexible working options based around home and office locations, with flexible working patterns.
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits.
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.

Profile Description
Standard Chartered Bank

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Lowe’s
Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.

About The Team
The Lowe’s One Roof Media Network Technology team delivers low-latency ad-tech solutions to our LORMN client partners. The team delivers high quality and uses cutting-edge technology.

Job Summary
The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver modules, stable application systems, and data or platform solutions. This includes developing, configuring, or modifying integrated business and/or enterprise infrastructure or application solutions within various computing environments. This role facilitates the implementation and maintenance of business and enterprise data or platform solutions to ensure successful deployment of released applications.

Roles & Responsibilities
Core Responsibilities:
Helps develop integrated business and/or enterprise application solutions in the data analytics space to ensure specifications are flexible, scalable, and maintainable and meet architectural standards.
With help from more senior engineers, develops software/data solutions for business requirements using a basic understanding of programming fundamentals.
Ensures basic unit testing and functional testing coverage accounting for all boundary conditions according to the system integration test plan.
Follows best source control and continuous integration/continuous deployment practices for efficient testing and deployment of code to different environments as defined for the team.
Reviews technical documents, design, code, and demonstrations to learn from more senior engineers and stay aligned in team approach.
Analyzes and organizes data to help deliver insights requested by the business.
Helps develop, maintain, and enhance operational and analytical (including self-serve) applications across various business domains; delivers reports on-premises and on cloud infrastructure; uses frameworks and reusable components whenever possible.
Troubleshoots system issues, helps in root cause analysis, and ensures conformance of the technology solutions with IT governance and regulatory frameworks.
Helps implement infrastructure-related projects for the organization.

Years Of Experience
1 - 2 years of experience in data engineering

Education Qualification & Certifications (optional)
Required Minimum Qualifications
Bachelor's degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)

Skill Set Required
Good proficiency and experience with the following:
SQL
Python
Hadoop or Cloud
Spark or PySpark
Hive
Oozie/Airflow
CI/CD
Git

Secondary Skills (desired)
Preferable experience in the following:
Airflow
GCP cloud experience
BigQuery
Trino

Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Job Title: Technical Director
Location: Mumbai
Experience: 12+ years

Job Summary
We are seeking a highly experienced and visionary Senior Data Architect to lead the design and implementation of scalable, secure, and high-performance data platforms across cloud and hybrid environments. The ideal candidate will bring deep expertise in data engineering, cloud architecture, and modern data paradigms such as Data Mesh and Lakehouse, with a proven track record of delivering enterprise-grade solutions.

Key Responsibilities
Lead the architecture, design, and implementation of data platforms including Data Lakes, Data Warehouses, and Lakehouses on AWS, Azure, and GCP.
Define and implement data strategies, governance, and best practices for data ingestion, transformation, and consumption.
Collaborate with cross-functional teams including business stakeholders, data scientists, and engineering teams to deliver robust data solutions.
Provide technical leadership in pre-sales engagements, RFP responses, and solutioning for new business opportunities.
Mentor and guide data engineering teams, fostering a culture of innovation and continuous learning.
Drive the adoption of modern data architecture principles such as Data Mesh and real-time streaming.
Evaluate and recommend tools, frameworks, and platforms to enhance data capabilities.

Required Skills & Qualifications
15+ years of experience in data engineering and architecture roles.
Strong hands-on experience with cloud platforms: AWS, Azure, GCP.
Expertise in tools and technologies such as Snowflake, Databricks, Elasticsearch, Kafka, Informatica, Pentaho, Apache Spark, Hive.
Proficiency in Python, SQL, PL/SQL, and real-time data processing (CDC, Debezium, Kafka).
Deep understanding of Data Lake, Data Warehouse, Data Mesh, and Lakehouse architectures.
Experience in leading large-scale data migration and modernization projects.

Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Qualifications
Any engineering or post graduation

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 10 Lacs

Hyderabad

Work from Office


Let’s do this. Let’s change the world. In this vital role you will collaborate with business partners, service owners and IS peers to develop predictive models and insights across the US Commercial Organization. This position will innovate and build significant business impact through the use of sophisticated analytics techniques to help Amgen with its mission to serve patients by helping them get the therapies they need.

Flexible Commuter role to Amgen India office. You will work on-site 2-3 days a week.

This position will be primarily responsible for:
Working collaboratively with cross-functional teams on projects and/or programs with aims to systematically derive insights that ultimately drive substantial business value for Amgen and our patients
Identifying business needs and proposing potential analytics approaches for solutions
Crafting and deploying a framework to supervise the performance of various campaigns and tactics at a granular level
Leading measurement and tracking of various omnichannel CX enablement initiatives
Supporting the development of data science and machine learning prototypes, proofs of concept and models for testing various omnichannel strategies
Communicating analysis ideas, progress and results to leadership and business partners

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Doctorate degree and data science and/or analytics experience OR Master’s degree and 4 to 6 years of data science and/or analytics experience OR Bachelor’s degree and 6 to 8 years of data science and/or analytics experience OR Diploma and 10 to 12 years of data science and/or analytics experience

Preferred Qualifications:
Relevant work experience in campaign measurement, marketing analytics and resource optimization in the pharma domain
Programming experience with Python, R, or SAS and experience with ML libraries like scikit-learn, MLlib, or TensorFlow
Experience working with large datasets; experience working with distributed computing tools (Spark, Hive, etc.) is a plus
Ability to communicate analysis in a clear, detailed, and practical manner
Passion for learning and staying on top of current developments in sophisticated analytics
Biotech / Pharma experience

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

Posted 1 week ago

Apply

6.0 - 10.0 years

11 - 15 Lacs

Hyderabad

Work from Office


Data Analytics Manager

What you will do
Let’s do this. Let’s change the world. This role will be the Strategic Insights + Analytics (SIA) Team’s resident subject matter expert on reporting design, meaningful metrics, storytelling through reporting and building reports optimized to meet stakeholders’ needs. This person will have expert hands-on Tableau, Power BI, and ETL development skills – including, but not limited to, the ability to quickly build minimum viable product dashboards as required by Amgen leaders and key stakeholders. Finally, this role will also provide consultation support to other reporting + analytics developers in Amgen’s CFO organization.

Primary Responsibilities:
Provide actionable, expert guidance to the SIA and FIT Reporting + Analytics teams regarding reporting design and development
Personally develop key reporting and analytics in Tableau or Power BI in response to critical, just-in-time CFO organization requests
Progress quickly developed reports to polished, automated, future-proof end states
Help design and implement a SIA/FIT reporting + analytics strategy, which could include but isn’t limited to a full-scale reporting migration from Tableau to Power BI
Stay current with the latest reporting + analytics trends and technology and make reporting strategy recommendations as needed to SIA/FIT leadership

Collaboration:
Partner with both US and India-based SIA and FIT colleagues to achieve shared objectives
Report directly to the hiring manager, a senior manager based in Thousand Oaks, California

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Required Skills + Qualifications:
Expert skill at reporting design and defining meaningful metrics
Expert proficiency in Tableau and Power BI development
Advanced “business analyst” skill at grasping and translating business requirements into technical requirements
Clear, concise verbal and written communication
Development experience with cloud storage and ETL tools such as Databricks and Prophecy
Solid understanding of finance concepts, financial statements and financial data
Skill in managing large and complex datasets

Additional Preferred Experience:
Familiarity with Oracle Hyperion, Anaplan, SAP S/4HANA, Workday and JIRA
Ability to work collaboratively with teams and stakeholders outside of SIA/FIT, including cross-functionally

Education / Prior Employment Qualifications:
Master’s degree and 5 years of finance or analytics development experience OR Bachelor’s degree and 8 years of finance or analytics development experience

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Overview
The individual will spend time building and maintaining our in-house planogram platform and leverage analytical and critical reasoning to solve complex, multidimensional problems using quantitative information and applying statistical and machine learning techniques. The C# .NET Developer will work with team members to develop the software that will implement our product assortment and placement onto PepsiCo's planogram platform.

Responsibilities
Expand and maintain the in-house planogram/reporting platform built using the C# .NET framework
Work with the team lead on enhancing the platform
Optimize shelf assortment across multiple categories while satisfying days-of-supply, blocking and flow constraints
Expand the platform to new categories
Apply machine learning techniques to assortment optimization and product placement
Enhance and maintain the platform UI

Qualifications
B.S./M.S. in a quantitative discipline required (e.g. computer science, mathematics, operations research, engineering)
7+ years of coding experience in C#, specifically using the .NET framework
3+ years of coding experience in Angular or an equivalent JS framework
Strong skills in C# using ASP.NET and .NET Core frameworks, LINQ and Entity Framework
Experience with project management tools such as DevOps
Ability to support and develop Windows and web platforms simultaneously
High-level querying skills using SQL languages such as SQL or Presto; knowledge of window functions, joins and sub-queries in SQL Server or any RDBMS
Experience with Azure, .NET Core, Visual Studio, SQL Server

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

On-site


DATA SCIENCE + GEN AI

Major Duties & Responsibilities:
Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions.
Create Proof of Concepts (POCs) / Minimum Viable Products (MVPs), then guide them through to production deployment and operationalization of projects.
Influence machine learning strategy for Digital programs and projects.
Make solution recommendations that appropriately balance speed to market and analytical soundness.
Explore design options to assess efficiency and impact, and develop approaches to improve robustness and rigor.
Develop analytical/modeling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow).
Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations.
Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories.
Create algorithms to extract information from large, multiparametric data sets.
Deploy algorithms to production to identify actionable insights from large databases.
Compare results from various methodologies and recommend optimal techniques.
Develop and embed automated processes for predictive model validation, deployment, and implementation.
Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science.
Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment.
Lead discussions at peer reviews and use interpersonal skills to positively influence decision making.
Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices.
Facilitate cross-geography sharing of new ideas, learnings, and best practices.

Required Qualifications:
Educational Requirement: Bachelor of Science or Bachelor of Engineering (at a minimum).
Experience: 4+ years of work experience as a Data Scientist.
Skills: A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to quickly cycle hypotheses through the discovery phase of a project.
Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala).
Good hands-on skills in both feature engineering and hyperparameter optimization.
Experience producing high-quality code, tests, and documentation.
Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, Databricks.
Understanding of descriptive and exploratory statistics, predictive modeling, evaluation metrics, decision trees, machine learning algorithms, optimization & forecasting techniques, and deep learning methodologies.
Proficiency in statistical concepts and machine learning algorithms.
Good knowledge of Agile principles and processes.
Ability to lead, manage, build, and deliver customer business results through data scientists or professional services teams.
Ability to share ideas compellingly, and to summarize and communicate data analysis assumptions and results.
Self-motivated and a proactive problem solver who can work independently and in teams.

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field
Must have a minimum of 6 years of relevant experience in IT
Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions
Proficiency in DBT (Data Build Tool) for data transformation and modelling
Experience with ETL/ELT processes and integrating data from multiple sources
Experience in designing Tableau dashboards, data visualizations, and reports
Familiarity with data warehousing concepts and best practices
Strong problem-solving skills and ability to work in cross-functional teams

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
4+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Must be strong in SQL.
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.

Posted 1 week ago

Apply

Exploring Hive Jobs in India

Apache Hive is a popular data warehousing tool built on top of Hadoop, used for querying and managing large datasets in distributed storage through HiveQL, a SQL-like language. In India, the demand for professionals with expertise in Hive is on the rise, with many organizations looking to hire skilled individuals for various roles related to data processing and analysis.
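
To give a feel for day-to-day Hive work, here is a minimal sketch of querying a Hive table through PySpark, a pairing that appears in several of the listings above. The sales table and its region and amount columns are hypothetical examples, not taken from any posting.

    # A minimal sketch of querying Hive from PySpark; the sales table and
    # its columns (region, amount) are hypothetical examples.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-example")
        .enableHiveSupport()  # lets Spark read tables from the Hive metastore
        .getOrCreate()
    )

    # HiveQL stays close to standard SQL: aggregate sales by region
    result = spark.sql("""
        SELECT region, SUM(amount) AS total_amount
        FROM sales
        GROUP BY region
    """)
    result.show()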

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

These cities are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.

Average Salary Range

The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.

Related Skills

Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.

Interview Questions

  • What is Hive and how does it differ from traditional databases? (basic)
  • Explain the difference between HiveQL and SQL. (medium)
  • How do you optimize Hive queries for better performance? (advanced)
  • What are the different types of tables supported in Hive? (basic)
  • Can you explain the concept of partitioning in Hive tables? (medium)
  • What is the significance of metastore in Hive? (basic)
  • How does Hive handle schema evolution? (advanced)
  • Explain the use of SerDe in Hive. (medium)
  • What are the various file formats supported by Hive? (basic)
  • How do you troubleshoot performance issues in Hive queries? (advanced)
  • Describe the process of joining tables in Hive. (medium)
  • What is dynamic partitioning in Hive and when is it used? (advanced)
  • How can you schedule jobs in Hive? (medium)
  • Discuss the differences between bucketing and partitioning in Hive. (advanced)
  • How do you handle null values in Hive? (basic)
  • Explain the role of the Hive execution engine in query processing. (medium)
  • Can you give an example of a complex Hive query you have written? (advanced)
  • What is the purpose of the Hive metastore? (basic)
  • How does Hive support ACID transactions? (medium)
  • Discuss the advantages and disadvantages of using Hive for data processing. (advanced)
  • How do you secure data in Hive? (medium)
  • What are the limitations of Hive? (basic)
  • Explain the concept of bucketing in Hive and when it is used. (medium)
  • How do you handle schema evolution in Hive? (advanced)
  • Discuss the role of Hive in the Hadoop ecosystem. (basic)
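
Several of the questions above return to partitioning and bucketing, so a small worked example helps when preparing. Below is a minimal sketch run through PySpark with Hive support; the events and staging_events tables and their columns are hypothetical examples, and exact DDL can vary by Hive and Spark version.

    # A minimal sketch of partitioned and bucketed Hive tables via PySpark;
    # the events and staging_events tables are hypothetical examples.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-partitioning-example")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Partitioning splits data into directories by column value, so queries
    # filtering on event_date only scan the matching partitions.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS events (
            user_id BIGINT,
            action  STRING
        )
        PARTITIONED BY (event_date STRING)
    """)

    # Bucketing hashes rows into a fixed number of files per partition,
    # which helps joins and sampling on the bucketed column.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS events_bucketed (
            user_id BIGINT,
            action  STRING
        )
        CLUSTERED BY (user_id) INTO 32 BUCKETS
    """)

    # Dynamic partitioning: Hive routes rows to partitions from the data itself.
    spark.sql("SET hive.exec.dynamic.partition=true")
    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    spark.sql("""
        INSERT INTO events PARTITION (event_date)
        SELECT user_id, action, event_date FROM staging_events
    """)

As a rule of thumb, partitioning suits low-cardinality columns such as dates, where each value maps to a directory, while bucketing suits high-cardinality join keys, where rows are hashed into a fixed number of files.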

Closing Remark

As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!
