
3233 Databricks Jobs - Page 29

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Greetings from Tata Consultancy Services! Thank you for expressing your interest in exploring a career possibility with the TCS family.

Role: Data Architect
Experience: 10+ years
Location: Mumbai/Chennai/Indore

Role Overview: The Head of Data Engineering is responsible for building and maintaining scalable data pipelines, data lakes, and real-time analytics infrastructure to support AI, automation, and business intelligence across the enterprise.

Key Responsibilities:
• Design and implement data pipelines (ETL/ELT) and real-time data streaming.
• Manage data lakes, warehouses, and enterprise data platforms.
• Ensure data is accessible, clean, and optimized for AI/ML applications.
• Implement data governance, security, and compliance best practices.
• Enable real-time data analytics using big data technologies (Kafka, Spark, Snowflake, Databricks).

Key Qualifications:
• 10+ years in data engineering, cloud data platforms, or big data architecture.
• Strong expertise in SQL, NoSQL, Apache Spark, Kafka, Snowflake, and cloud data services.
• Experience with data governance and compliance frameworks (GDPR, HIPAA, etc.).

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Title: GCP Cloud Operations Engineer
Location: Hyderabad (hybrid), India

About You: The GCP CloudOps Engineer is accountable for continuous, repeatable, secure, and automated deployment, integration, and test solutions utilizing Infrastructure as Code (IaC) and DevSecOps techniques.

- 8+ years of hands-on experience in infrastructure design, implementation, and delivery.
- 3+ years of hands-on experience with monitoring tools (Datadog, New Relic, or Splunk).
- 4+ years of hands-on experience with container orchestration services, including Docker or Kubernetes/GKE.
- 5+ years of hands-on experience in cloud technologies; GCP is preferred.
- Experience working across time zones and with different cultures.
- Maintain an outstanding level of documentation, including principles, standards, practices, and project plans.
- Hands-on experience with IaC patterns and practices and related automation tools such as Terraform, Jenkins, Spinnaker, CircleCI, etc.; has built automation and tools using Python, Go, Java, or Ruby.
- Deep knowledge of CI/CD processes, tools, and platforms like GitHub workflows and Azure DevOps.
- Proactive collaborator who can work on cross-team initiatives, with excellent written and verbal communication skills.
- Experience automating long-term solutions to problems rather than applying a quick fix.
- Extensive knowledge of improving platform observability and implementing optimizations to monitoring and alerting tools.
- Experience measuring and modeling cost and performance metrics of cloud services and establishing a vision backed by data.
- Develop tools and CI/CD frameworks to make it easier for teams to build, configure, and deploy applications.
- Contribute to cloud strategy discussions and decisions on overall cloud design and the best approach for implementing cloud solutions.
- Follow and develop standards and procedures for all aspects of a digital platform in the cloud.
- Identify system enhancements and automation opportunities for installing/maintaining digital platforms.
- Adhere to best practices for incident, problem, and change management.
- Implement automated procedures to handle issues and alerts proactively.
- Experience debugging applications and a deep understanding of deployment architectures.

Pluses:
- Databricks; experience building a data warehouse using Databricks is a huge plus.
- Experience with a multicloud environment (GCP, AWS, Azure); GCP is the preferred cloud provider.
- Experience with GitHub and GitHub Actions.

Thanks and Regards,
Sandeep Reddy
Senior Resource Coordinator, Intune Systems Inc.
USA: +1 214-230-2747
India (WhatsApp & Call): +91 98857 57527
Address: 3620 N Josey Ln, #220C, Carrollton, TX 75007

Posted 1 week ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Pune

Work from Office

Source: Naukri

Role Summary: We are seeking a highly skilled Senior Data Science Consultant with 8+ years of experience to lead an internal optimization initiative. The ideal candidate should have a strong background in data science, operations research, and mathematical optimization, with a proven track record of applying these skills to solve complex business problems. This role requires a blend of technical depth, business acumen, and collaborative communication. A background in internal efficiency/operations improvement or cost/resource optimization projects is highly desirable.

Key Responsibilities:
- Lead and contribute to internal optimization-focused data science projects from design to deployment.
- Develop and implement mathematical models to optimize resource allocation, process performance, and decision-making.
- Use techniques such as linear programming, mixed-integer programming, and heuristic and metaheuristic algorithms.
- Collaborate with business stakeholders to gather requirements and translate them into data science use cases.
- Build robust data pipelines and use statistical and machine learning methods to drive insights.
- Communicate complex technical findings in a clear, concise manner to both technical and non-technical audiences.
- Mentor junior team members and contribute to knowledge sharing and best practices within the team.

Required Skills and Qualifications:
- Master's or PhD in Data Science, Computer Science, Operations Research, Applied Mathematics, or related fields.
- Minimum 8 years of relevant experience in data science, with a strong focus on optimization.
- Expertise in Python (NumPy, Pandas, SciPy, scikit-learn), SQL, and optimization libraries such as PuLP, Pyomo, Gurobi, or CPLEX.
- Experience with the end-to-end lifecycle of internal optimization projects.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications:
- Experience working on internal company projects focused on logistics, resource planning, workforce optimization, or cost reduction.
- Exposure to tools/platforms like Databricks, Azure ML, or AWS SageMaker.
- Familiarity with dashboards and visualization tools like Power BI or Tableau.
- Prior experience in consulting or internal centers of excellence (CoE) is a plus.
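To make the optimization work described above concrete, here is a minimal sketch of the kind of linear program such a role builds, using PuLP, one of the libraries named in the qualifications. The scenario, variable names, and coefficients are illustrative assumptions, not part of the job posting.

```python
# Toy resource-allocation LP using PuLP (pip install pulp).
# All numbers and names below are hypothetical, for illustration only.
from pulp import LpMaximize, LpProblem, LpVariable, value

prob = LpProblem("staff_allocation", LpMaximize)

# Decision variables: analyst-hours assigned to two internal projects
hours_a = LpVariable("hours_project_a", lowBound=0)
hours_b = LpVariable("hours_project_b", lowBound=0)

# Objective: maximize estimated value delivered per hour
prob += 40 * hours_a + 30 * hours_b

# Constraints: total capacity and a per-project cap
prob += hours_a + hours_b <= 100   # total available hours
prob += hours_a <= 60              # project A cannot absorb more

prob.solve()
print(hours_a.varValue, hours_b.varValue, value(prob.objective))
```

Real engagements swap the toy coefficients for data-derived estimates and often move to mixed-integer formulations; the modeling pattern (variables, objective, constraints, solve) stays the same.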

Posted 1 week ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Mumbai

Work from Office

Source: Naukri

Role Summary: We are seeking a highly skilled Senior Data Science Consultant with 8+ years of experience to lead an internal optimization initiative. The ideal candidate should have a strong background in data science, operations research, and mathematical optimization, with a proven track record of applying these skills to solve complex business problems. This role requires a blend of technical depth, business acumen, and collaborative communication. A background in internal efficiency/operations improvement or cost/resource optimization projects is highly desirable.

Key Responsibilities:
- Lead and contribute to internal optimization-focused data science projects from design to deployment.
- Develop and implement mathematical models to optimize resource allocation, process performance, and decision-making.
- Use techniques such as linear programming, mixed-integer programming, and heuristic and metaheuristic algorithms.
- Collaborate with business stakeholders to gather requirements and translate them into data science use cases.
- Build robust data pipelines and use statistical and machine learning methods to drive insights.
- Communicate complex technical findings in a clear, concise manner to both technical and non-technical audiences.
- Mentor junior team members and contribute to knowledge sharing and best practices within the team.

Required Skills and Qualifications:
- Master's or PhD in Data Science, Computer Science, Operations Research, Applied Mathematics, or related fields.
- Minimum 8 years of relevant experience in data science, with a strong focus on optimization.
- Expertise in Python (NumPy, Pandas, SciPy, scikit-learn), SQL, and optimization libraries such as PuLP, Pyomo, Gurobi, or CPLEX.
- Experience with the end-to-end lifecycle of internal optimization projects.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications:
- Experience working on internal company projects focused on logistics, resource planning, workforce optimization, or cost reduction.
- Exposure to tools/platforms like Databricks, Azure ML, or AWS SageMaker.
- Familiarity with dashboards and visualization tools like Power BI or Tableau.
- Prior experience in consulting or internal centers of excellence (CoE) is a plus.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

About _VOIS: _VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About _VOIS India: In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Key Responsibilities:
- Design and build data pipelines: develop scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3.
- Create efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes.
- Build and manage applications using Python, SQL, Databricks, and various AWS technologies.
- Utilize QuickSight to create insightful data visualizations and dashboards.
- Quickly develop innovative proof-of-concept (POC) solutions to address emerging needs.
- Provide support and manage the ongoing operation of data services.
- Automate repetitive tasks and build reusable frameworks to improve efficiency.
- Work with teams to design and develop data products that support marketing and other business functions.
- Ensure data services are reliable, maintainable, and seamlessly integrated with existing systems.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proficiency in Python with Pandas and PySpark.
- Hands-on experience with AWS services including S3, Glue, Lambda, API Gateway, and SQS.
- Knowledge of data processing tools like Spark, Hive, Kafka, and Airflow.
- Experience with batch job scheduling and managing data dependencies.
- Experience with QuickSight or similar tools.
- Familiarity with DevOps automation tools like GitLab, Bitbucket, Jenkins, and Maven.
- Understanding of Delta Lake would be an added advantage.

_VOIS Equal Opportunity Employer Commitment, India: _VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
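As a rough illustration of the pipeline work this listing describes, the sketch below shows a minimal PySpark ETL job of the kind that might run on Glue or Databricks: read raw data, clean it, and write a partitioned, query-friendly output. The bucket paths, column names, and filter rules are invented for the example, not taken from the posting.

```python
# Minimal PySpark ETL sketch; paths and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw JSON events landed in an S3 "raw" zone
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: deduplicate, derive a date column, drop bad records
clean = (
    orders.dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_ts"))
          .filter(F.col("amount") > 0)
)

# Load: write partitioned Parquet to the curated zone for QuickSight/warehouse use
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/curated/orders/"))
```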

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

- Working with event-based/streaming technologies to ingest and process data.
- Working with other members of the project team to support delivery of additional project components (API interfaces, search).
- Evaluating the performance and applicability of multiple tools against customer requirements.
- Working within an Agile delivery/DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
- Strong knowledge of data management principles.
- Experience in building ETL/data warehouse transformation processes.
- Direct experience of building data pipelines using Databricks.
- Experience using geospatial frameworks on Apache Spark and associated design and development patterns.
- Experience working in a DevOps environment with tools such as Terraform.

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Organizations everywhere struggle under the crushing costs and complexities of "solutions" that promise to simplify their lives. To create a better experience for their customers and employees. To help them grow. Software is a choice that can make or break a business. Create better or worse experiences. Propel or throttle growth. Business software has become a blocker instead of a way to get work done. There's another option: Freshworks, with a fresh vision for how the world works.

At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use, and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks' customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And over 4,500 Freshworks employees make this possible, all around the world. Fresh vision. Real impact. Come build it with us.

Job Description: As a member of the Data Platform team, you'll be at the forefront of transforming how the Freshworks Datalake can be harnessed to the fullest in making data-driven decisions.

Key job responsibilities:
- Drive the backbone of our data platform by building robust pipelines that turn complex data into actionable insights using AWS and the Databricks platform.
- Be a data detective by ensuring our data is clean, accurate, and trustworthy.
- Write clean, efficient code that handles massive amounts of structured and unstructured data.

Qualifications:
- Proficient in major languages such as Scala, and in Kafka (any variant).
- Write elegant and maintainable code, and be comfortable picking up new technologies.
- Proficient in working with distributed systems, with experience in distributed processing frameworks that can handle data in batch and near real-time, e.g. Spark.
- Experience working with various AWS services and Databricks to build end-to-end data solutions that bring different systems together.
- Requires 8-15 years of experience in a related field.

Additional Information: At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion irrespective of their background, gender, race, sexual orientation, religion and ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities and the business.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

At Juniper, we believe the network is the single greatest vehicle for knowledge, understanding, and human advancement the world has ever known. To achieve real outcomes, we know that experience is the most important requirement for networking teams and the people they serve. Delivering an experience-first, AI-Native Network pivots on the creativity and commitment of our people. It requires a consistent and committed practice, something we call the Juniper Way.

Position: Supply Chain Data Engineer
Experience: 7+ years
Location: Bangalore

About the Position: Juniper's Supply Chain Operations is a data-driven organization, and the demand for Data Engineering, Data Science and Analytics solutions for decision-making has increased 4x over the last 3 years. In addition, continuous changes in the regulatory environment and geopolitical issues call for a very flexible and resilient supply chain requiring many new data-driven use cases. We need a self-motivated team player for this critical role in the Data Analytics Team to continue to satisfy and fulfill the growing demand for data and data-driven solutions, including developing AI solutions on top of the SCO data stack.

Responsibilities: As a member of the SCO Analytics team, this role will be responsible for implementing and delivering Business Intelligence initiatives in supply chain operations. This role will be responsible for collaborating with key business users, developing key metrics and reports, and preparing the underlying data using new automated data preparation tools like Alteryx. This role will also interface with Juniper Enterprise IT for seamless delivery of integrated solutions. Major responsibilities include leading/delivering Data Science & Business Intelligence initiatives in supply chain operations, collaborating with key business users, developing insightful analytical models, metrics and reports, and coordinating with Juniper Enterprise IT for seamless delivery of system-based solutions.

Minimum Qualifications:
- Bachelor's degree.
- 7+ years of hands-on skills and understanding of reporting solutions and data models.
- Building end-to-end data engineering pipelines for semi-structured and unstructured data (text, simple/complex table structures, images, video and audio data).
- Python, PySpark, SQL, RDBMS.
- Data transformation (ETL/ELT) activities.
- SQL data warehouse (e.g. Snowflake) experience, preferably including administration.
- Techno-functional system analysis skills including requirements documentation, use case definition, and testing methodologies.
- Experience in managing Data Quality and Data Catalog solutions.
- Ability to learn and adapt to Juniper's end-to-end business processes.
- Strong interpersonal, written and verbal communication.

Preferred Qualifications:
- Working experience in analytics solutions like Snowflake, Tableau, Databricks, Alteryx and SAP Business Objects tools is preferred.
- Understanding of supply chain business processes and their integration with other areas of business.

Personal Skills:
- Ability to collaborate cross-functionally and build sound working relationships within all levels of the organization.
- Ability to handle sensitive information with keen attention to detail and accuracy. Passion for data handling ethics.
- Effective time management skills and ability to solve complex technical problems with creative solutions while anticipating stakeholder needs and helping meet or exceed expectations.
- Comfortable with ambiguity and uncertainty of change when assessing needs for stakeholders.
- Self-motivated and innovative; confident when working independently, but an excellent team player with a growth-oriented personality.

Other Information:
- Relocation is not available for this position.
- Travel requirements for the position: 10%.

About Juniper Networks: Juniper Networks challenges the inherent complexity that comes with networking and security in the multicloud era. We do this with products, solutions and services that transform the way people connect, work and live. We simplify the process of transitioning to a secure and automated multicloud environment to enable secure, AI-driven networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net) or connect with Juniper on Twitter, LinkedIn and Facebook.

WHERE WILL YOU DO YOUR BEST WORK? Wherever you are in the world, whether it's downtown Sunnyvale or London, Westford or Bengaluru, Juniper is a place that was founded on disruptive thinking - where colleague innovation is not only valued, but expected. We believe that the great task of delivering a new network for the next decade is delivered through the creativity and commitment of our people. The Juniper Way is the commitment to all our colleagues that the culture and company inspire their best work - their life's work. At Juniper we believe this is more than a job - it's an opportunity to help change the world.

At Juniper Networks, we are committed to elevating talent by creating a trust-based environment where we can all thrive together. If you think you have what it takes, but do not necessarily check every single box, please consider applying. We'd love to speak with you.

Additional Information for United States jobs - ELIGIBILITY TO WORK AND E-VERIFY: In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Juniper Networks participates in the E-Verify program. E-Verify is an Internet-based system operated by the Department of Homeland Security (DHS) in partnership with the Social Security Administration (SSA) that allows participating employers to electronically verify the employment eligibility of new hires and the validity of their Social Security Numbers. E-Verify® is a registered trademark of the U.S. Department of Homeland Security.

Juniper is an Equal Opportunity workplace. We do not discriminate in employment decisions on the basis of race, color, religion, gender (including pregnancy), national origin, political affiliation, sexual orientation, gender identity or expression, marital status, disability, genetic information, age, veteran status, or any other applicable legally protected characteristic. All employment decisions are made on the basis of individual qualifications, merit, and business need.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

About the Position: Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Data Engineer designs data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost effective.

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Microsoft Fabric).
- Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
- Collaborate with data scientists, data analysts, data architects and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
- Ensure data quality and integrity through data validation techniques and frameworks (see the sketch after this listing).
- Develop and maintain documentation for data processes, configurations, and best practices.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
- Manage the CI/CD process for deploying and maintaining data solutions.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience), with demonstrated high proficiency in programming fundamentals.
- 3-5 years of experience, including at least 2 years of proven experience as a Data Engineer or in a similar role dealing with data and ETL processes.
- Strong knowledge of Microsoft Azure services, including Azure Data Factory, Azure Synapse, Azure Databricks, Azure Blob Storage and Azure Data Lake Gen2.
- Experience using SQL DML to query modern RDBMSs efficiently (e.g., SQL Server, PostgreSQL).
- Strong understanding of software engineering principles and how they apply to data engineering (e.g., CI/CD, version control, testing).
- Experience with big data technologies (e.g., Spark).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Learning agility, technical leadership, and consulting on and managing business needs.
- Strong experience in Python preferred; experience in other languages such as Scala, Java, C#, etc. is accepted.
- Experience building Spark applications using PySpark.
- Experience with file formats such as Parquet, Delta, and Avro.
- Experience efficiently querying API endpoints as a data source.
- Understanding of the Azure environment and related services such as subscriptions, resource groups, etc.
- Understanding of Git workflows in software development.
- Using Azure DevOps pipelines and repositories to deploy and maintain solutions.
- Understanding of Ansible and how to use it in Azure DevOps pipelines.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
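The data-validation responsibility referenced above could look something like the following minimal PySpark sketch: count rule violations and fail the run before bad data propagates downstream. The table path, column names, and rules are hypothetical, chosen only to illustrate the pattern.

```python
# Minimal data-quality-check sketch in PySpark; table and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.format("delta").load("/mnt/curated/sales")  # assumed Delta path

total = df.count()
checks = {
    "null_keys": df.filter(F.col("sale_id").isNull()).count(),
    "negative_amounts": df.filter(F.col("amount") < 0).count(),
    "duplicate_keys": total - df.dropDuplicates(["sale_id"]).count(),
}

# Fail the pipeline run if any rule is violated, so bad data never propagates
failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print(f"All checks passed on {total} rows")
```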

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Responsibilities:
- End-to-end development and delivery of MIS reports and dashboards supporting credit card and lending portfolio acquisition, early engagement, existing customer management, rewards, retention and attrition.
- Partner with business stakeholders to understand requirements and deliver actionable insights through automated reporting solutions.
- Maintain and optimize existing SAS-based reporting processes while leading the migration to Python/PySpark on big data platforms.
- Design and build interactive dashboards in Tableau for senior leadership and regulatory reporting.
- Build and implement an automated audit framework to ensure data accuracy, completeness and consistency across the entire reporting life cycle.
- Collaborate with Data Engineering and IT teams to leverage data lakes and enterprise data platforms.
- Mentor junior analysts and contribute to knowledge sharing across teams.
- Support ad-hoc analysis and audits with quick turnaround and attention to data integrity.

Qualifications:
- Experience in MIS reporting, data analytics, or BI in Banking/Financial Services, with a strong focus on credit cards.
- Proficiency in SAS for data extraction, manipulation, and automation.
- Advanced skills in Python and PySpark, particularly in big data environments (e.g., Hadoop, Hive, Databricks).
- Expertise in Tableau for dashboard design and data storytelling.

Job Family Group: Management Development Programs
Job Family: Undergraduate
Time Type:

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Pune

Hybrid

Source: Naukri

Hello folks,

We are currently hiring for a Developer role (graduates only).

Experience: 5+ years
Mandatory skills: SQL, NoSQL, Spark, Databricks, Kafka, Opentable, DBS, Java microservices, Python development
Notice period: 1 month to 15 days (only candidates currently serving their notice period)
Excellent communication skills required.

This is a C2H (contract-to-hire) role. Interested candidates, please share your resume at dipti.bhaisare@in.experis.com

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote

Source: LinkedIn

Who We Are: At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio: Join the team as our next Staff Software Engineer, Warehouse Activation.

About the Job: This position is needed to lead, in a technical capacity, a team focused on Twilio Segment Warehouse Activation. As a Staff Software Engineer in the Warehouse Activation group, you'll build and scale systems that process 1B+ rows per month, helping our customers unlock the value of our Customer Data Platform (CDP). The team you will help lead will have a deep understanding of large distributed systems and data processing at scale. In addition, the systems you manage connect directly to customer data warehouses such as Snowflake and Databricks. In your role you will be responsible for understanding data warehouse APIs and how best to integrate with these systems at high scale. We iterate quickly on these products and features and learn new things daily - all while writing quality code. We work closely with product and design and solve some of the toughest engineering problems to unlock new possibilities for our customers. If you get excited by building products with high customer impact - this is the place for you.

Responsibilities - in this role, you will:
- Design and build the next generation of the Warehouse Activation platform, process billions of events, and power various use cases for Twilio Segment customers. This encompasses working on stream data processing, storage, and other mission-critical systems.
- Ship features that opt for high availability and throughput with eventual consistency.
- Collaborate with engineering and product leads, as well as teams across Twilio Segment.
- Support the reliability and security of the platform.
- Build and optimize globally available and highly scalable distributed systems.
- Be able to act as a team Tech Lead as needed.
- Mentor other engineers on the team in technical architecture and design.
- Partner with application teams to deliver end-to-end customer success.

Qualifications: Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:
- 8+ years of experience writing production-grade code in a modern programming language.
- Strong theoretical fundamentals and hands-on experience designing and implementing highly available and performant fault-tolerant distributed systems.
- Experience programming in one or more of the following: Go, Java, Scala, or similar languages.
- Well-versed in concurrent programming, along with a solid grasp of Linux systems and networking concepts.
- Experience operating large-scale, distributed systems on top of cloud infrastructure such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
- Experience with message passing systems (e.g., Kafka, AWS Kinesis) and/or modern stream processing systems (e.g., Spark, Flink).
- Hands-on experience with container orchestration frameworks (e.g., Kubernetes, EKS, ECS).
- Experience shipping services (products) following the CI/CD development paradigm.
- Deep understanding of architectural patterns of high-scale web applications (e.g., well-designed APIs, high-volume data pipelines, efficient algorithms).
- Ideally, domain expertise in the modern data stack, with experience in developing cloud-based data solution components and architecture covering data ingestion, data processing and data storage.
- A track record of successfully leading teams and large projects, or having owned and built an important, complex system end to end, delivered iteratively.
- Excellent written and verbal technical communication skills to convey complex technical concepts effectively.

Location: This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu and Telangana).

What We Offer: There are many benefits to working at Twilio, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values - something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.

Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote

Source: LinkedIn

Who We Are: At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio: Join the team as our next Software Engineer (L3), Warehouse Activation.

About the Job: This position is needed to manage and grow a team focused on Twilio Segment Warehouse Activation. As a Software Engineer (L3) in the Warehouse Activation group, you'll build and scale systems that process 1B+ rows per month, helping our customers unlock the value of our Customer Data Platform (CDP). The team you will contribute to will have a deep understanding of large distributed systems and data processing at scale. In addition, the systems you manage connect directly to customer data warehouses such as Snowflake and Databricks. In your role you will be responsible for understanding data warehouse APIs and how best to integrate with these systems at high scale. We iterate quickly on these products and features and learn new things daily - all while writing quality code. We work closely with product and design and solve some of the toughest engineering problems to unlock new possibilities for our customers. If you get excited by building products with high customer impact - this is the place for you.

Responsibilities - in this role, you will:
- Design and build the next generation of the Warehouse Activation platform, process billions of events, and power various use cases for Twilio Segment customers. This encompasses working on stream data processing, storage, and other mission-critical systems.
- Ship features that opt for high availability and throughput with eventual consistency.
- Collaborate with engineering and product leads, as well as teams across Twilio Segment.
- Support the reliability and security of the platform.
- Build and optimize globally available and highly scalable distributed systems.
- Mentor other engineers on the team in technical architecture and design.
- Partner with application teams to deliver end-to-end customer success.

Qualifications: Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:
- 5+ years of experience writing production-grade code in a modern programming language.
- Strong theoretical fundamentals and hands-on experience designing and implementing highly available and performant fault-tolerant distributed systems.
- Experience programming in one or more of the following: Go, Java, Scala, or similar languages.
- Well-versed in concurrent programming, along with a solid grasp of Linux systems and networking concepts.
- Experience operating large-scale, distributed systems on top of cloud infrastructure such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
- Experience with message passing systems (e.g., Kafka, AWS Kinesis) and/or modern stream processing systems (e.g., Spark, Flink).
- Hands-on experience with container orchestration frameworks (e.g., Kubernetes, EKS, ECS).
- Experience shipping services (products) following the CI/CD development paradigm.
- Strong understanding of architectural patterns of high-scale web applications (e.g., well-designed APIs, high-volume data pipelines, efficient algorithms).
- Ideally, domain expertise in the modern data stack, with experience in developing cloud-based data solution components and architecture covering data ingestion, data processing and data storage.
- A track record of successfully delivering on large or complex projects, delivered iteratively.
- Strong written and verbal technical communication skills to convey complex technical concepts effectively.
- Comfortable asking questions and taking initiative to solve problems where it is often necessary to "draw the owl".

Location: This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu and Telangana).

What We Offer: There are many benefits to working at Twilio, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values - something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.

Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

By clicking the “Apply” button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description:

The Future Begins Here: At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India’s epicenter of innovation, has been selected to be home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement.

At Takeda’s ICC we Unite in Diversity: Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for their backgrounds and the abilities they bring to our company. We are continuously improving our collaborators’ journey in Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity: As a Data Scientist, you will have the opportunity to apply your analytical skills and expertise to extract meaningful insights from vast amounts of data. We are currently seeking a talented and experienced individual to join our team and contribute to our data-driven decision-making process.

Objectives:
- Collaborate with different business users, mainly Supply Chain/Manufacturing, to understand the current state and identify opportunities to transform the business into a data-driven organization.
- Translate processes and requirements into analytics solutions and metrics with an effective data strategy, data quality, and data accessibility for decision making.
- Operationalize decision-support solutions and drive user adoption, gathering feedback and Voice of Customer metrics in order to improve analytics services.
- Understand the analytics drivers and the data to be modeled, apply the appropriate quantitative techniques to provide the business with actionable insights, and ensure analytics models and data are accessible to end users to evaluate “what-if” scenarios and support decision making.
- Evaluate the data, analytical models, and experiments periodically to validate hypotheses, ensuring they continue to provide business value as requirements and objectives evolve.

Accountabilities:
- Collaborates with business partners in identifying analytical opportunities and developing BI-related goals and projects that will create strategically relevant insights.
- Works with internal and external partners to develop an analytics vision and programs to advance BI solutions and practices.
- Understands data and sources of data; strategizes with the IT development team and develops a process to collect, ingest, and deliver data along with proper data models for analytical needs.
- Interacts with business users to define pain points, problem statements, scope, and the analytics business case.
- Develops solutions with recommended data models and business intelligence technologies including data warehouses, data marts, OLAP modeling, dashboards/reporting, and data queries.
- Works with DevOps and database teams to ensure proper design of system databases and appropriate integration with other enterprise applications.
- Collaborates with the Enterprise Data and Analytics Team to design data models and visualization solutions that synthesize complex data for data mining and discovery.
- Assists in defining requirements and facilitates workshops and prototyping sessions.
- Develops and applies technologies such as machine-learning and deep-learning algorithms to enable advanced analytics product functionality.

Education, Behavioural Competencies and Skills:
- Bachelor’s degree from an accredited institution in Data Science, Statistics, Computer Science, or a related field.
- 3+ years of experience with statistical modeling such as clustering, segmentation, multivariate analysis, regression, etc., and analytics tools such as R, Python, Databricks, etc., required.
- Experience in developing and applying predictive and prescriptive modeling, deep learning, or other machine learning techniques a plus.
- Hands-on development of AI solutions that comply with industry standards and government regulations.
- Great numerical and analytical skills, as well as basic knowledge of Python analytics packages (Pandas, scikit-learn, statsmodels).
- Ability to build and maintain scalable and reliable data pipelines that collect, transform, manipulate, and load data from internal and external sources.
- Ability to use statistical tools to conduct data analysis and identify data quality issues throughout the data pipeline.
- Experience with BI and visualization tools (e.g., Qlik, Power BI), ETL, NoSQL, and proven design skills a plus.
- Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams.
- Experience working with agile teams.

What Takeda Can Offer You: Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training and a diverse and inclusive network of colleagues who will support your career growth.

Benefits: It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career. Our benefits include:
- Competitive salary + performance annual bonus.
- Flexible work environment, including hybrid working.
- Comprehensive healthcare insurance plans for self, spouse, and children.
- Group term life insurance and group accident insurance programs.
- Health and wellness programs, including annual health screening and weekly health sessions for employees.
- Employee Assistance Program.
- 3 days of leave every year for voluntary service, in addition to humanitarian leave.
- Broad variety of learning platforms.
- Diversity, equity, and inclusion programs.
- Reimbursements: home internet and mobile phone.
- Employee referral program.
- Leaves: paternity leave (4 weeks), maternity leave (up to 26 weeks), bereavement leave (5 calendar days).

About ICC in Takeda: Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.

Locations: IND - Bengaluru
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Mysore, Karnataka, India

On-site

Source: LinkedIn

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities:
- Identify business problems, understand the customer issue, and fix the issue.
- Evaluate recurring issues and work on permanent solutions.
- Focus on service improvement.
- Troubleshoot technical issues and design flaws.

Preferred Education: Master's degree.

Required Technical and Professional Expertise:
- BE/B.Tech in any stream, or M.Sc. (Computer Science/IT)/M.C.A., with a minimum of 3-5 years of experience.
- Expertise and hands-on experience with Azure IaaS, PaaS and SaaS services, including all of the following:
- VM, Storage Account, Load Balancer, Application Gateway, VNet, Route Table, Azure Bastion, Disaster Recovery, Backup, NSG, Azure Update Manager, Key Vault, etc.
- Azure Web App, Function App, Logic App, AKS (Azure Kubernetes Service) and containerization, Docker, Event Hub, Redis Cache, service mesh and Istio, Application Insights, Databricks, AD, DNS, Log Analytics workspace, ARO (Azure Red Hat OpenShift).
- Orchestration and containerization: Docker, Kubernetes, Red Hat OpenShift.
- Security management: firewall management, FortiGate firewall.

Preferred Technical and Professional Experience:
- Monitoring through cloud-native tools (CloudWatch, CloudTrail, Azure Monitor, Activity Log, vROps and Log Insight).
- Server monitoring and management (Windows, Linux, AIX, AWS Linux, Ubuntu Linux).
- Storage monitoring and management (Blob, S3, EBS, backups, recovery, snapshots).

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Source: Naukri

Job Opening: Senior Data Engineer (Remote, Contract - 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in data engineering.
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults.
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance.
- Agile, SDLC, containerization (Docker), clean coding practices.

Good-to-Have Skills:
- Event Hubs, Logic Apps.
- Power BI.
- Strong logic building and a competitive programming background.

Mode: Remote. Duration: 6 months. Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
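For a sense of what "clean, reusable PySpark code" against Databricks Delta tables might look like in a pipeline like the one this listing describes, here is a small sketch of an incremental upsert using the Delta Lake MERGE API. The mount paths and key column are assumptions made for illustration.

```python
# Incremental upsert into a Delta table; paths and key column are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # preconfigured on Databricks clusters

# New batch landed (e.g., by ADF) into the raw zone
updates = spark.read.parquet("/mnt/raw/customers_batch/")

# Target curated Delta table
target = DeltaTable.forPath(spark, "/mnt/curated/customers")

# MERGE: update existing rows, insert new ones, keyed on customer_id
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```

A merge of this shape keeps reruns idempotent: replaying the same batch updates rows in place instead of duplicating them, which is why Delta upserts are a common building block in ADF-orchestrated pipelines.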

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Gurgaon/Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained industrious advantage. Our Chief Data Office also known as our Innovation, Data Intelligence & Analytics team (IDA) is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking an Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of the data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner . What You’ll Be DOING What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns, architecture (granular level), partner with Architects to ensure optimal design of data layers. Apply best practices in Data architecture. For example, balance between materialization and virtualization, optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, performance tuning. Leading and hands-on execution of research into new technologies. Formulating frameworks for assessment of new technology vs business benefit, implications for data consumers. Act as a best practice expert, blueprint creator of ways of working such as testing, logging, CI/CD, observability, release, enabling rapid growth in data inventory and utilization of Data Science Platform. Design prototypes and work in a fast-paced iterative solution delivery model. Design, Develop and maintain ETL pipelines using Pyspark in Azure Databricks using delta tables. Use Harness for deployment pipeline. Monitor Performance of ETL Jobs, resolve any issue that arose and improve the performance metrics as needed. Diagnose system performance issue related to data processing and implement solution to address them. Collaborate with other teams to ensure successful integration of data pipelines into larger system architecture requirement. Maintain integrity and quality across all pipelines and environments. Understand and follow secure coding practice to make sure code is not vulnerable. You will report to Technical Lead. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Effective Communication skills. Bachelor’s degree in computer science, Mathematics, Statistics, Finance, related technical field, or equivalent work experience. Relevant years of extensive work experience in various data engineering & modeling techniques (relational, data warehouse, semi-structured, etc.), application development, advanced data querying skills. Relevant years of programming experience using Databricks. Relevant years of experience using Microsoft Azure suite of products(ADF, synapse and ADLS). 
  • Solid knowledge of network and firewall concepts.
  • Solid experience writing, optimizing, and analyzing SQL.
  • Relevant years of experience with Python.
  • Ability to break down complex data requirements and architect solutions into achievable targets.
  • Robust familiarity with Software Development Life Cycle (SDLC) processes and workflows, especially Agile.
  • Experience using Harness.
  • Experience as a technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities

  • Worked on big data migration projects.
  • Worked on performance tuning at both the database and big data platform levels.
  • Ability to interpret complex data requirements and architect solutions.
  • Distinctive problem-solving and analytical skills combined with robust business acumen.
  • Solid fundamentals of Parquet and Delta file formats.
  • Effective knowledge of the Azure cloud computing platform.
  • Familiarity with reporting software - Power BI is a plus.
  • Familiarity with DBT is a plus.
  • Passion for data and experience working within a data-driven organization.
  • You care about what you do, and what we do.

Who WE are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER

Inclusion

AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.

  • Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe.
  • Robust support for Flexible Working Arrangements
  • Enhanced family-friendly leave benefits
  • Named to the Diversity Best Practices Index
  • Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards

AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability

At AXA XL, Sustainability is integral to our business strategy.
In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars

Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.

Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.

Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.

AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.

For more information, please see axaxl.com/sustainability.
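Aside: the posting above centers on PySpark ETL into Delta tables on Azure Databricks. Purely as a hedged illustration of that pattern - every path, table, and column name below is invented, not taken from the posting - a minimal incremental load might look like:

```python
# Minimal sketch of a PySpark -> Delta ETL step; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created in Databricks notebooks

# Extract: raw JSON landed in ADLS (placeholder storage account and container)
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/policies/")

# Transform: de-duplicate, fix types, and derive a partition column
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
       .withColumn("policy_year", F.year(F.col("effective_date").cast("date")))
       .withColumn("load_ts", F.current_timestamp())
)

# Load: append into a partitioned, managed Delta table
(clean.write.format("delta")
      .mode("append")
      .partitionBy("policy_year")
      .saveAsTable("curated.policies"))
```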

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

What You’ll Do

An entry-level associate consultant will be responsible for implementing data warehousing and reporting solutions based on SAP Analytics or Data Management products like SAP Business Data Cloud (SAP BDC), SAP Datasphere, SAP HANA, or SAP Analytics Cloud (SAC).

  • Interact with the lead to understand requirements and build objects as per the given design.
  • Take responsibility for technical implementation of the business models, replication of data from source systems, and creation of dashboards or other visualizations using one of the above SAP solutions.
  • Understand defined project methodologies and follow them during execution.
  • Understand the technical design and build the objects as per the design using SAP products.
  • Work as part of a large team under guidance from a technical lead.
  • Flexibility to work across various skills and a proactive approach to taking ownership.
  • Reliability to deliver on time and with high quality.

What You Bring

  • Working experience of 0-3 years in the software industry with a degree in computer science, engineering, accounting, finance, or a related field.
  • Hands-on experience in one of the below skills:
    • SAP Analytics portfolio (SAC, HANA Modelling, BW, Datasphere)
    • SAP Data Management or other data management tools
    • ABAP development experience
    • Databricks AI experience
    • BTP application development experience
    • Node.js experience
  • Experience working on simple business scenarios and implementing them with a given design.
  • Experience in coding and database design.
  • Understanding of database objects like tables, schemas, views, and ER models.
  • Knowledge of analytics concepts, data warehousing, or any reporting or dashboarding.
  • Advanced knowledge of SQL scripting with the ability to write stored procedures.
  • Knowledge of the SAP Analytics and/or Data Management portfolio (e.g. SAC, HANA, BW, Datasphere, etc.) or knowledge of ABAP development is preferred.
  • Good grasp of programming concepts and knowledge of JavaScript and Python.
  • Ability to understand business requirements, derive logic, and implement it technically either as code or as a model.
  • Good problem-solving and troubleshooting skills.
  • Good oral and written communication and interpersonal skills.

Location: Bangalore or Gurgaon.

Bring out your best

SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development.
Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion

SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability

Qualified applicants will receive consideration for employment without regard to their race, religion, national origin, ethnicity, age, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability.

Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 423048 | Work Area: Consulting and Professional Services | Expected Travel: 0 - 10% | Career Status: Graduate | Employment Type: Regular Full Time | Additional Locations: .

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Associate Specialist - Data Delivery & Operations

Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained enterprise advantage. Our Innovation, Data, and Analytics Office (IDA) is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking an Associate Scientist for our Data Sourcing & Solutions team. The role sits across the IDA Department to make sure customer requirements are properly captured and transformed into actionable data specifications. Success in the role will require a focus on proactive management of the sourcing and management of data from source through usage.

What You’ll Be DOING

What will your essential responsibilities include?

  • Accountable for documenting data requirements (business and functional requirements) and assessing the reusability of Axiom assets.
  • Build processes to simplify and expedite data sourcing, focusing on delivering data to AXA XL business stakeholders frequently.
  • Develop and operationalize strategic data products and answers, and proactively manage the sourcing and management of data from source through usage (reusable Policy and Claim domain data assets).
  • Perform data validation testing of the data products in partnership with the AXA XL business to ensure the accuracy of the data and validation of the requirements.
  • Assess all data required as part of the Data Ecosystem to make sure data has a single version of the truth.
  • Respond to ad-hoc data requests to support AXA XL's business.
  • Instill a customer-first attitude, prioritizing service for our business stakeholders above all else.
  • Internalize and execute IDA and company-wide goals to become a data-driven organization.
  • Contribute to best practices and standards to make sure there is a consistent and efficient approach to capturing business requirements and translating them into functional, non-functional, and semantic specifications.
  • Develop a comprehensive understanding of the data and our customers.
  • Drive root cause analysis for identified data deficiencies within reusable data assets delivered via IDA.
  • Identify solution options to improve the consistency, accuracy, and quality of data when captured at its source.

You will report to the Senior Scientist - Data Sourcing, Delivery & Operations.

What You Will BRING

We’re looking for someone who has these abilities and skills:

Required Skills And Abilities

  • A minimum of a bachelor’s or master's degree (preferred) in a relevant discipline.
  • Experience in a data role (business analyst, data analyst, analytics), preferably in the insurance industry and within a data division.
  • Robust SQL knowledge and the technical ability to query AXA XL data sources to understand our data.
  • Excellent presentation, communication (oral & written), and relationship-building skills, across all levels of management and customer interaction.
  • Insurance experience in data, underwriting, claims, and/or operations, including influencing, collaborating, and leading efforts in complex, disparate, and interrelated teams with competing priorities.
  • Passion for data and experience working within a data-driven organization.
  • Ability to bring together internal data with external industry data to deliver holistic answers.
  • Ability to work with unstructured data to unlock information needed by the business to create unique products for the insurance industry.
  • Possesses robust exploratory analysis skills and high intellectual curiosity.
  • Displays exceptional organizational skills and is detail-oriented.
  • A robust conceptual thinker who 'connects dots', with critical thinking and analytical skills.

Desired Skills And Abilities

  • Ability to work with team members across the globe and across departments.
  • Ability to take ownership, work under pressure, and meet deadlines.
  • Builds trust and rapport within and across groups.
  • Applies in-depth knowledge of business and specialized areas to solve business problems and understand integration challenges and long-term impact creatively and strategically.
  • Ability to manage the data needs of individual projects while understanding the broader enterprise data perspective.
  • Expected to recommend innovations and improvements to policies and procedures, deploy resources, and perform core activities.
  • Experience with SQL Server, Azure Databricks Notebook, QlikView, Power BI, and Jira/Confluence is a plus.

Who WE are

AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER

Inclusion

AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.

  • Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe.
  • Robust support for Flexible Working Arrangements
  • Enhanced family-friendly leave benefits
  • Named to the Diversity Best Practices Index
  • Signatory to the UK Women in Finance Charter

Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards

AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability

At AXA XL, Sustainability is integral to our business strategy.
In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars

Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.

Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.

Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.

AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving.

For more information, please see axaxl.com/sustainability.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role And Responsibilities

  • Identifying business problems, understanding the customer issue, and fixing it.
  • Evaluating recurring issues and working toward permanent solutions.
  • Focusing on service improvement.
  • Troubleshooting technical issues and design flaws.
  • Working both individually and on a team to deliver work on time.

Preferred Education: Master's Degree

Required Technical And Professional Expertise

  • BE / B.Tech in any stream, or M.Sc. (Computer Science/IT) / M.C.A, with a minimum of 3-5 years of experience.
  • Expertise in Azure IaaS, PaaS, and SaaS services, with hands-on experience in the services below:
    • VM, Storage Account, Load Balancer, Application Gateway, VNET, Route Table, Azure Bastion, Disaster Recovery, Backup, NSG, Azure Update Manager, Key Vault, etc.
    • Azure Web App, Function App, Logic App, AKS (Azure Kubernetes Service) and containerization, Docker, Event Hub, Redis Cache, Service Mesh and Istio, App Insights, Databricks, AD, DNS, Log Analytics Workspace, ARO (Azure Red Hat OpenShift).
  • Orchestration and containerization: Docker, Kubernetes, Red Hat OpenShift.
  • Security management: firewall management, FortiGate firewall.

Preferred Technical And Professional Experience

  • Monitoring through cloud-native tools (CloudWatch, CloudTrail, Azure Monitor, Activity Log, vROps, and Log Insight).
  • Server monitoring and management (Windows, Linux, AIX, AWS Linux, Ubuntu Linux).
  • Storage monitoring and management (Blob, S3, EBS, backups, recovery, snapshots).

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Creating Passion: Your Responsibilities

  • Design, implementation, and optimization of ETL and data pipelines for business intelligence and analytics applications.
  • Responsibility for monitoring and troubleshooting existing jobs and job control.
  • Design and development of data quality checks to ensure data integrity.
  • Developing and maintaining comprehensive documentation.
  • Collaborating with data analysts, data scientists, and business stakeholders in an international context.

Contributing Your Strengths: Your Qualifications

Qualification and Education Requirements:

  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • Strong conceptual and analytical skills, with a high level of quality awareness.
  • Proven ability to work independently and collaboratively within a globally distributed team.
  • Experience in data engineering and data warehousing, including practical implementation and management of data pipelines.
  • Excellent communication and collaboration skills.

Experience: 5-9 years of experience implementing and managing ETL and data pipelines.

Preferred Skills / Special Skills:

  • Extensive experience and proficiency in Microsoft SQL Server and SSIS (mandatory).
  • Knowledge of development with Databricks, Azure Data Lake, and Python is advantageous; experience with cloud technologies and big data environments is a plus.

Have we sparked your interest? Then we look forward to receiving your online application. If you have any questions, please contact Sonali Samal.

One Passion. Many Opportunities.

The Company

Liebherr CMCtec India Private Limited in Pune (India) was established in 2008 and started its manufacturing plant in its own facility on Pune Solapur Highway in 2012. The company is responsible for the production of tower cranes and drives.

Location

Liebherr CMCtec India Private Limited, Sai Radhe, 6th Floor, Behind Hotel Sheraton Grand, Raja Bahadur Mill Road, Pune, 411001, India (IN)

Contact

Sonali Samal
sonali.samal@liebherr.com
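Aside: this posting is SQL Server/SSIS-centric but lists Databricks and Python as advantageous. As a hedged sketch only - the table and column names are invented - a simple data-quality gate of the kind the responsibilities describe could be written in PySpark as:

```python
# Illustrative data-quality checks on a staged table; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("staging.orders")  # hypothetical staging table

checks = {
    "null_order_ids": df.filter(F.col("order_id").isNull()).count(),
    "negative_amounts": df.filter(F.col("amount") < 0).count(),
    "duplicate_keys": df.count() - df.dropDuplicates(["order_id"]).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # Raise so the job scheduler marks the run as failed and alerts on it
    raise ValueError(f"Data quality checks failed: {failed}")
```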

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with a strong understanding of technology and data, and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as of a growing Data and Analytics team.

Your Key Responsibilities

  • Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
  • Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
  • Understand current and future state enterprise architecture.
  • Contribute to various technical streams during implementation of the project.
  • Provide product- and design-level technical best practices.
  • Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for data ingestion, processing and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies (a code sketch follows at the end of this listing).

Skills And Attributes For Success

  • Experience architecting highly scalable solutions on Azure, AWS and GCP.
  • Strong understanding of and familiarity with the Azure/AWS/GCP and big data ecosystem components.
  • Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
  • Hands-on experience with major components like cloud ETLs, Spark and Databricks.
  • Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
  • Good knowledge of Apache Kafka and Apache Flume.
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking enterprise applications.
  • Experience in data security [on the move, at rest].
  • Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

  • A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communication skills (written and verbal, formal and informal).
  • The ability to multi-task under pressure and work independently with minimal supervision.
  • A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
  • Adaptability to new technologies and standards.
  • Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
  • Responsibility for the evaluation of technical risks and mapping out mitigation strategies.
  • Working knowledge of at least one cloud platform: AWS, Azure or GCP.
  • Excellent business communication, consulting and quality process skills.
  • Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains.
  • A minimum of 7 years of hands-on experience in one or more of the above areas.
  • A minimum of 10 years of industry experience.

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
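Aside: for the real-time ingestion responsibility called out above, here is a hedged sketch of Kafka ingestion with Spark Structured Streaming into Delta. The broker address, topic, schema, and paths are placeholders, not details from the posting:

```python
# Sketch: consume JSON events from Kafka and append them to a Delta table.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Hypothetical event schema
schema = StructType([
    StructField("trade_id", StringType()),
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
         .option("subscribe", "trades")                      # placeholder topic
         .load()
         # Kafka delivers raw bytes; parse the JSON value into typed columns
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "/tmp/checkpoints/trades")  # enables restart recovery
          .outputMode("append")
          .start("/tmp/delta/trades")
)
# query.awaitTermination()  # block the driver when running as a standalone job
```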

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site


Currently we have an open position with our client - an IT consulting firm - for a Principal Databricks Engineer/Architect.

Key Responsibilities:

  1. Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
  2. Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks.
  3. Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
  4. Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
  5. Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
  6. Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
  7. Thought Leadership: Stay up-to-date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.

Requirements:

  1. Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
  2. Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake.
  3. Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
  4. Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
  5. Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.

Good to Have:

  1. Certifications: Databricks Certified Professional or similar certifications.
  2. Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
  3. Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.
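Aside: given this posting's emphasis on Delta Lake and pipeline work, one staple pattern worth knowing is the upsert (MERGE) into a Delta table. The sketch below is illustrative only - the table names and the customer_id key are invented, not taken from the posting:

```python
# Hedged sketch of a Delta Lake upsert; table names and keys are illustrative.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

target = DeltaTable.forName(spark, "curated.customers")   # existing Delta table
updates = spark.table("staging.customer_updates")         # incoming batch

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # update rows whose key already exists
       .whenNotMatchedInsertAll()   # insert brand-new rows
       .execute())
```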

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with a strong understanding of technology and data, and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as of a growing Data and Analytics team.

Your Key Responsibilities

  • Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
  • Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
  • Understand current and future state enterprise architecture.
  • Contribute to various technical streams during implementation of the project.
  • Provide product- and design-level technical best practices.
  • Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for data ingestion, processing and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success

  • Experience architecting highly scalable solutions on Azure, AWS and GCP.
  • Strong understanding of and familiarity with the Azure/AWS/GCP and big data ecosystem components.
  • Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
  • Hands-on experience with major components like cloud ETLs, Spark and Databricks.
  • Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
  • Good knowledge of Apache Kafka and Apache Flume.
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking enterprise applications.
  • Experience in data security [on the move, at rest].
  • Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

  • A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communication skills (written and verbal, formal and informal).
  • The ability to multi-task under pressure and work independently with minimal supervision.
  • A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
  • Adaptability to new technologies and standards.
  • Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
  • Responsibility for the evaluation of technical risks and mapping out mitigation strategies.
  • Working knowledge of at least one cloud platform: AWS, Azure or GCP.
  • Excellent business communication, consulting and quality process skills.
  • Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains.
  • A minimum of 7 years of hands-on experience in one or more of the above areas.
  • A minimum of 10 years of industry experience.

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Auto, Supply Chain, and Finance.

The opportunity

We’re looking for Managers (GTM + Cloud/Big Data Architects) with a strong understanding of technology and data, and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as of a growing Data and Analytics team.

Your Key Responsibilities

  • Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
  • Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
  • Understand current and future state enterprise architecture.
  • Contribute to various technical streams during implementation of the project.
  • Provide product- and design-level technical best practices.
  • Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions.
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for data ingestion, processing and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success

  • Experience architecting highly scalable solutions on Azure, AWS and GCP.
  • Strong understanding of and familiarity with the Azure/AWS/GCP and big data ecosystem components.
  • Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
  • Hands-on experience with major components like cloud ETLs, Spark and Databricks.
  • Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB.
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
  • Good knowledge of Apache Kafka and Apache Flume.
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking enterprise applications.
  • Experience in data security [on the move, at rest].
  • Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

  • A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communication skills (written and verbal, formal and informal).
  • The ability to multi-task under pressure and work independently with minimal supervision.
  • A team-player attitude and enjoyment of working in a cooperative and collaborative team environment.
  • Adaptability to new technologies and standards.
  • Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support.
  • Responsibility for the evaluation of technical risks and mapping out mitigation strategies.
  • Working knowledge of at least one cloud platform: AWS, Azure or GCP.
  • Excellent business communication, consulting and quality process skills.
  • Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains.
  • A minimum of 7 years of hands-on experience in one or more of the above areas.
  • A minimum of 10 years of industry experience.

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Related Skills

In addition to Databricks expertise, skills that are often expected or helpful include:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks. (medium)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium) (a code sketch follows this list)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
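To give a flavor of the hands-on answers interviewers expect, here is one possible way to address the schema-evolution question above. This is a hedged sketch rather than the canonical answer: Delta Lake can merge compatible schema changes at write time via the mergeSchema option, and the table and column names below are invented for illustration:

```python
# Appending a batch that carries a new column into an existing Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical new batch with an extra 'device' column not yet in the target
df = spark.createDataFrame(
    [("e1", "click", "mobile")], ["event_id", "event_type", "device"]
)

(df.write.format("delta")
   .mode("append")
   .option("mergeSchema", "true")  # merge the new column into the table schema
   .saveAsTable("curated.events"))
```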

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!
