
3233 Databricks Jobs - Page 32

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary

Position Summary

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
Lead and deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
Assist and participate in pre-sales, client pursuits, and proposals
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members

Qualifications:
6-10 years of relevant hands-on experience in Data Science, Machine Learning, and Statistical Modeling
Bachelor’s or Master’s degree in a quantitative field
Led a 3-5 member team on multiple end-to-end DS/ML projects
Excellent communication and client/stakeholder management skills
Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
Expertise in classification, regression, time series, decision trees, optimization, etc.
Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
Model deployment on cloud or on-prem is an added advantage
Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
AI/Cloud certification from a premier institute is preferred

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300022
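For context, a minimal sketch of the kind of classification workflow this listing describes, using the stack it names (Pandas, scikit-learn); the file, column, and target names are hypothetical placeholders, not part of the posting:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical customer dataset with a binary "churned" target.
df = pd.read_csv("customers.csv")
X, y = df.drop(columns=["churned"]), df["churned"]

numeric = ["tenure_months", "monthly_spend"]
categorical = ["region", "plan_type"]

# Scale numeric columns and one-hot encode categoricals in one pipeline.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

model = Pipeline([
    ("prep", preprocess),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```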

Posted 1 week ago

Apply

2.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe action

Job Title: Data Scientist/Machine Learning Engineer

Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions.

Key Responsibilities:
Deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members

Qualifications:
2-7 years of relevant hands-on experience in Data Science, Machine Learning, and Statistical Modeling
Bachelor’s or Master’s degree in a quantitative field
Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
Expertise in classification, regression, time series, decision trees, optimization, etc.
Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
Model deployment on cloud or on-prem is an added advantage
Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
AI/Cloud certification from a premier institute is preferred

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300100
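As an illustration of the Python/PySpark/SQL combination this listing asks for, a minimal sketch of large-scale feature preparation; the table path and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-prep").getOrCreate()

# Hypothetical orders table in a data lake.
orders = spark.read.parquet("/data/orders")

# Aggregate per-customer features with the DataFrame API.
features = (
    orders
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_spend"),
        F.max("order_date").alias("last_order_date"),
    )
)

# The same data is then queryable with plain SQL.
features.createOrReplaceTempView("customer_features")
spark.sql("""
    SELECT customer_id, total_spend
    FROM customer_features
    WHERE order_count >= 3
    ORDER BY total_spend DESC
""").show(10)
```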

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description

Are you passionate about data science, with a comprehensive understanding of analytical principles, tools, and technologies, and the ability to convey insights to both executive and non-technical audiences? If so, this could be the ideal opportunity for you.

As an Analytics Solutions Analyst on the Instrumentation & Metrics (I&M) team, you are an integral part of the team responsible for leveraging expertise in data science to develop and maintain production-grade models using various analytical techniques. You will provide ad-hoc analytics support to the Payments organization, transforming complex data into actionable insights. Additionally, you will guide the team on best practices and techniques in data science, ensuring the effective use of data to drive business decisions.

Job Responsibilities
Develop complex analytical models that provide a comprehensive view of the business by integrating data from multiple sources.
Utilize advanced SQL, Alteryx, Tableau, and Python programming skills, along with expertise in multivariate statistics, quantitative modeling, and advanced analytical methods (e.g., supervised/unsupervised machine learning, time-series prediction, NLP).
Conduct data discovery and analytics to extract insights that enhance existing financial products and support decision-making processes.
Collaborate with product managers, data architects, software engineers, and business analysts to build a company-centric analytics product in a production environment.
Demonstrate excellent understanding of business strategy and data science opportunities.
Communicate complex data challenges and solutions to diverse audiences across various levels of the banking organization, including those unfamiliar with advanced machine learning techniques.

Required Qualifications, Capabilities and Skills
Bachelor’s or Master’s degree in statistics, mathematics, data science, or a related technical or quantitative field, with 3+ years of applied experience.
Strong understanding of agile methodologies, statistics, and AI/ML engineering, with a proven track record of developing and deploying business-critical machine learning models in production.
Proficiency in programming languages such as Python, and experience with machine learning frameworks, libraries, and APIs, including TensorFlow, PyTorch, scikit-learn, etc.
Ability to identify and address AI/ML challenges, implement optimizations, and fine-tune models for optimal performance.
Basic knowledge of data system components needed to determine necessary controls.
Excellent written and verbal communication skills to effectively convey technical concepts and results to both technical and business audiences.

Preferred Qualifications, Capabilities and Skills
Familiarity with the financial services industry.
Background in NLP and advanced analytics.
Knowledge of financial products and services, including trading, investment, and risk management.
Experience working with Databricks/Snowflake.

About Us
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses, and many of the world’s most prominent corporate, institutional, and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years, and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing, and asset management.

We recognize that our people are our strength, and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About the Team
J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services, and payments. Corporations, governments, and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk, and extends liquidity in markets around the world.
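The role mentions time-series prediction among its analytical methods; a minimal sketch of one common pattern (lag features feeding a standard regressor), with a hypothetical series and columns:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Hypothetical daily volume series with a date index.
ts = pd.read_csv("daily_volume.csv", parse_dates=["date"]).set_index("date")

# Simple lag features let an ordinary regressor forecast one step ahead.
for lag in (1, 7, 28):
    ts[f"lag_{lag}"] = ts["volume"].shift(lag)
ts = ts.dropna()

# Hold out the final 30 days for evaluation, preserving time order.
train, test = ts.iloc[:-30], ts.iloc[-30:]
features = [c for c in ts.columns if c.startswith("lag_")]

model = GradientBoostingRegressor(random_state=42)
model.fit(train[features], train["volume"])
preds = model.predict(test[features])
print("MAE:", mean_absolute_error(test["volume"], preds))
```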

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Summary

Position Summary

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
Lead and deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
Assist and participate in pre-sales, client pursuits, and proposals
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members

Qualifications:
6-10 years of relevant hands-on experience in Data Science, Machine Learning, and Statistical Modeling
Bachelor’s or Master’s degree in a quantitative field
Led a 3-5 member team on multiple end-to-end DS/ML projects
Excellent communication and client/stakeholder management skills
Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
Expertise in classification, regression, time series, decision trees, optimization, etc.
Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
Model deployment on cloud or on-prem is an added advantage
Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
AI/Cloud certification from a premier institute is preferred

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300022

Posted 1 week ago

Apply

2.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe action

Job Title: Data Scientist/Machine Learning Engineer

Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions.

Key Responsibilities:
Deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members

Qualifications:
2-7 years of relevant hands-on experience in Data Science, Machine Learning, and Statistical Modeling
Bachelor’s or Master’s degree in a quantitative field
Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
Expertise in classification, regression, time series, decision trees, optimization, etc.
Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
Model deployment on cloud or on-prem is an added advantage
Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
AI/Cloud certification from a premier institute is preferred

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300100

Posted 1 week ago

Apply

2.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Summary

Position Summary

AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe action

Job Title: Data Scientist/Machine Learning Engineer

Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions.

Key Responsibilities:
Deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members

Qualifications:
2-7 years of relevant hands-on experience in Data Science, Machine Learning, and Statistical Modeling
Bachelor’s or Master’s degree in a quantitative field
Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
Expertise in classification, regression, time series, decision trees, optimization, etc.
Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
Model deployment on cloud or on-prem is an added advantage
Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
AI/Cloud certification from a premier institute is preferred

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300100

Posted 1 week ago

Apply

12.0 - 15.0 years

0 Lacs

Greater Kolkata Area

On-site


About NCR VOYIX

NCR VOYIX Corporation (NYSE: VYX) is a leading global provider of digital commerce solutions for the retail, restaurant, and banking industries. NCR VOYIX is headquartered in Atlanta, Georgia, with approximately 16,000 employees in 35 countries across the globe. For nearly 140 years, we have been the global leader in consumer transaction technologies, turning everyday consumer interactions into meaningful moments. Today, NCR VOYIX transforms store, restaurant, and digital banking experiences with cloud-based, platform-led SaaS and services capabilities.

Not only are we the leader in the market segments we serve and the technology we deliver, but we create exceptional consumer experiences in partnership with the world’s leading retailers, restaurants, and financial institutions. We leverage our expertise, R&D capabilities, and unique platform to help navigate, simplify, and run our customers’ technology systems. Our customers are at the center of everything we do. Our mission is to enable stores, restaurants, and financial institutions to exceed their goals – from customer satisfaction to revenue growth, to operational excellence, to reduced costs and profit growth. Our solutions empower our customers to succeed in today’s competitive landscape. Our unique perspective brings innovative, industry-leading tech to all the moving parts of business across industries. NCR VOYIX has earned the trust of businesses large and small – from the best-known brands around the world to your local favorite around the corner.

Title: Senior Software Engineering Manager – Data Engineering & Full Stack
Experience: 12-15 years
Location: Hyderabad/Gurgaon/Virtual

YOU ARE…
Passionate about technology and see the world a little differently than your peers. Everywhere you look, there’s possibility. Opportunity. Boundaries to push and challenges to solve. You believe software engineering changes how people live. At NCR Voyix, we believe that, too. We’re one of the world’s first tech companies, and still going strong. Like us, you know the online and mobile worlds better than anyone else, and see patterns that no one else sees. Our leadership team drives the delivery of products that provide optimal performance and stability with unsurpassed longevity, built on over 25 years in the restaurants, retail, payments, and services industry.

We are looking for talented people to join our expanding NCR Voyix Data and Analytics platform team. Our product, a cloud-based SaaS solution, provides the foundation for the NCR Voyix cloud-based Data and Analytics platform. Our primary customers are merchants you see and visit every day in the retail, grocery, and hospitality industries. We experience the impact our work is having, and we take pride in providing services with great availability and ease of use.

IN THIS ROLE, YOU CAN EXPECT TO…
The NCR Voyix Software Engineer will be responsible for front-end and back-end solution design, software development, code quality, data security, production readiness, and performance tuning. The ideal candidate is an experienced software engineer who enjoys optimizing data systems and building them from the ground up. The Software Engineer will support database architects, data analysts, and data scientists on data initiatives and will ensure that optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. The NCR Voyix Software Engineer contributes in the following:

KEY AREAS OF RESPONSIBILITY:
Lead a team of talented developers and leads working on full-stack frameworks and data engineering
Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
Mine and analyze data from different NCR data sources to drive optimization of operations and improve customer experience
Assess the effectiveness and accuracy of new data sources and data-gathering techniques
Develop custom data models and algorithms to apply to data sets
Use predictive modeling to increase and optimize customer experiences, cost savings, actionable insights, and other business outcomes
Develop the company A/B testing framework and test model quality (see the sketch after this listing)
Collaborate with different functional teams to implement models and monitor outcomes
Develop processes and tools to monitor and analyze model performance and data accuracy
Be part of an Agile team, participate in all Agile ceremonies and activities, and be accountable for the sprint deliverables
Create and maintain optimal data delivery architecture
Assemble large, complex data sets that meet functional and non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure and GCP “big data” technologies
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data delivery needs
Keep our data separated and secure across national boundaries through multiple data centers and cloud regions
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
Work with data and analytics experts to strive for greater functionality in our data systems

YOU HAVE…
15+ years of experience in software testing or software engineering
10+ years in non-functional automation and performance testing
10+ years in public-cloud-based engineering
React.js understanding: experience with React components, hooks, and state management
JavaScript/TypeScript knowledge
Node.js: expertise in server-side development using Node.js
RESTful APIs and GraphQL: ability to design and consume APIs
Agile methodologies: experience in Agile, Scrum, or Kanban environments
UI/UX principles: basic understanding for effective collaboration with designers
Experience building and optimizing “big data” data pipelines, architectures, and data sets
Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong analytic skills related to working with structured and unstructured datasets
Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
A successful history of manipulating, processing, and extracting value from large, disconnected datasets
Working knowledge of message queuing, stream processing, and highly scalable “big data” data stores
Strong project management and organizational skills
Experience supporting and working with cross-functional teams in a dynamic environment
Experience with ETL and big data integration services: Confluent Kafka, BigQuery, Databricks, Data Factory, etc.
Experience with relational SQL and NoSQL databases, including Databricks, BigQuery, Azure Data Warehouse, etc.
Experience with stream-processing systems: KSQL, Flink SQL, dbt Labs, Databricks, Spark Streaming, etc.
Experience with object-oriented, functional, and scripting languages: Python, Java, C#, Scala, etc.
Experience with DevOps tools (CI and DevOps): GitHub, GitHub Actions, Jenkins, JIRA, Chef, Sonar
Experience with unit testing, integration testing, performance testing, and user acceptance testing

BASIC QUALIFICATIONS:
Strong inferential skills with an ability to succinctly communicate complex topics to business stakeholders
Experience with UI and full-stack frameworks like React.js, Node.js, TypeScript, Material UI, SASS, etc.
Experience using cloud platforms like Azure or GCP
Experience working with complex on-premises and cloud data architectures

GENERAL KNOWLEDGE, SKILLS AND ABILITIES:
Exhibits leadership skills
Azure or GCP public cloud technologies
In-depth knowledge of end-to-end systems development life cycles (including agile, iterative, and other modern approaches to software development)
Outstanding verbal and written communication skills for technical and non-technical audiences at various levels of the organization (e.g., executive, management, individual contributors)
Ability to estimate work effort for project sub-plans or small projects and ensure projects are successfully completed
Quality assurance mindset
Positive outlook, strong work ethic, and responsiveness to internal and external customers and contacts
Willingly and successfully fulfils the role of teacher, mentor, and coach
In-depth knowledge of networking, computing platforms, storage, databases, security, middleware, network and systems management, and related infrastructure technologies and practices

Offers of employment are conditional upon passage of screening criteria applicable to the job.

EEO Statement
Integrated into our shared values is NCR Voyix’s commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. NCR Voyix is committed to being a globally inclusive company where all people are treated fairly, recognized for their individuality, promoted based on performance, and encouraged to strive to reach their full potential. We believe in understanding and respecting differences among all people. Every individual at NCR Voyix has an ongoing responsibility to respect and support a globally diverse environment.

Statement to Third Party Agencies
To ALL recruitment agencies: NCR Voyix only accepts resumes from agencies on the preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Voyix employees, or any NCR Voyix facility. NCR Voyix is not responsible for any fees or charges associated with unsolicited resumes.

“When applying for a job, please make sure to only open emails that you will receive during your application process that come from an @ncrvoyix.com email domain.”
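On the A/B testing responsibility above, a minimal sketch of how a two-variant test result might be evaluated in Python; the conversion counts are made-up illustrative numbers, and the two-proportion z-test is just one reasonable choice of method:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results: successes and sample sizes
# for the control and variant groups.
conversions = [412, 468]
visitors = [10000, 10000]

# Two-sided two-proportion z-test on the conversion rates.
z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```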

Posted 1 week ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site


Overview

Working at Atlassian
Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, as part of being a distributed-first company.

Responsibilities
Atlassian is looking for a Senior Data Engineer to join our Data Engineering team, which is responsible for building our data lake, maintaining our big data pipelines and services, and facilitating the movement of billions of messages each day. We work directly with business stakeholders and plenty of platform and engineering teams to enable growth and retention strategies at Atlassian. We are looking for an open-minded, structured thinker who is passionate about building services that scale.

On a typical day you will help our stakeholder teams ingest data faster into our data lake, find ways to make our data pipelines more efficient, or even come up with ideas to help instigate self-serve data engineering within the company. You’ll get the opportunity to work on an AWS-based data lake backed by the full suite of open-source projects such as Spark and Airflow. We are a team with little legacy in our tech stack, and as a result you’ll spend less time paying off technical debt and more time identifying ways to make our platform better and improve our users’ experience.

Qualifications
As a Senior Data Engineer on the DE team, you will have the opportunity to apply your strong technical experience building highly reliable services to managing and orchestrating a multi-petabyte-scale data lake. You enjoy working in a fast-paced environment, and you are able to take vague requirements and transform them into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases.

On your first day, we'll expect you to have:
A BS in Computer Science or equivalent experience
At least 7+ years of professional experience as a Sr. Software Engineer or Sr. Data Engineer
Strong programming skills (Python, Java, or Scala preferred)
Experience writing SQL, structuring data, and data storage practices
Experience with data modeling
Knowledge of data warehousing concepts
Experience building data pipelines and platforms
Experience with Databricks, Spark, Hive, Airflow, and other streaming technologies to process incredible volumes of streaming data
Experience in modern software development practices (Agile, TDD, CI/CD)
A strong focus on data quality and experience with internal/external tools and frameworks to automatically detect data issues and anomalies
A willingness to accept failure, learn, and try again
An open mind to try solutions that may seem crazy at first
Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS, and the like)

It's preferred that you have:
Experience building self-service tooling and platforms
Built and designed Kappa architecture platforms
Contributed to open-source projects (e.g., operators in Airflow)
Experience with dbt (Data Build Tool)

Our Perks & Benefits
Atlassian offers a wide range of perks and benefits designed to support you, your family, and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
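Since the listing names Airflow for pipeline orchestration, a minimal Airflow 2.x-style DAG sketch of the ingest-then-transform pattern; the task logic and names are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # Placeholder: pull raw events into the data lake.
    print("ingesting raw events")

def transform():
    # Placeholder: run the downstream transformation job.
    print("transforming events")

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Transform runs only after ingestion succeeds.
    ingest_task >> transform_task
```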

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai, Maharashtra

Work from Office


About the Role:
Grade Level (for internal use): 11

The Team: You will be an expert contributor on the Ratings Organization’s Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization’s critical data domains, technology stacks, and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings’ next-gen analytics platform.

Responsibilities:
Architect, design, and implement innovative software solutions to enhance S&P Ratings' cloud-based analytics platform
Mentor a team of engineers (as required), fostering a culture of trust, continuous growth, and collaborative problem-solving
Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals
Manage and improve existing software solutions, ensuring high performance and scalability
Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes
Produce comprehensive technical design documents and conduct technical walkthroughs

Experience & Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent is required
Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development
10+ years of experience, with 4+ years designing/developing enterprise products, modern tech stacks, and data platforms
4+ years of hands-on experience contributing to application architecture and designs, proven software/enterprise integration design patterns, and full-stack knowledge, including modern distributed front-end and back-end technology stacks
5+ years of full-stack development experience in modern web development technologies: Java/J2EE, UI frameworks like Angular and React, SQL, Oracle, and NoSQL databases like MongoDB
Experience designing transactional, data warehouse, and data lake systems and data integrations with the big data ecosystem, leveraging AWS cloud technologies
Thorough understanding of distributed computing
Passionate, smart, and articulate developer
Quality-first mindset with a strong background in developing products for a global audience at scale
Excellent analytical thinking, interpersonal, oral, and written communication skills, with a strong ability to influence both IT and business partners
Superior knowledge of system architecture, object-oriented design, and design patterns
Good work ethic; a self-starter and results-oriented
Excellent communication skills are essential, with strong verbal and writing proficiencies
Experience with Delta Lake systems like Databricks, using AWS cloud technologies and PySpark, is a plus

Additional Preferred Qualifications:
Experience working with AWS
Experience with the SAFe Agile framework
Bachelor's/PG degree in Computer Science, Information Systems, or equivalent
Hands-on experience contributing to application architecture and designs, and proven software/enterprise integration design principles
Ability to prioritize and manage work to critical project timelines in a fast-paced environment
Excellent analytical and communication skills, with strong verbal and writing proficiencies
Ability to train and mentor

Posted 1 week ago

Apply

3.0 - 15.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


Technology & Transformation: EAD: Azure Data Engineer – Consultant/Senior Consultant/Manager

Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence and visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Your work profile
As a Consultant/Senior Consultant/Manager on our T&T team, you’ll build and nurture positive working relationships with teams and clients with the intention of exceeding client expectations:
Design, develop, and deploy solutions using different tools, design principles, and conventions
Configure robotics processes and objects using core workflow principles in an efficient way; ensure they are easily maintainable and easy to understand
Understand existing processes and facilitate change requirements as part of a structured change control process
Solve day-to-day issues arising while running robotics processes and provide timely resolutions
Maintain proper documentation for the solutions, test procedures, and scenarios during the UAT and production phases
Coordinate with process owners and the business to understand the as-is process and design the automation process flow

Desired Qualifications
3-15 years of hands-on experience implementing Azure cloud data warehouses, Azure and NoSQL databases, and hybrid data scenarios
Experience developing with Azure Data Factory (covering Azure Functions, Logic Apps, Triggers, IR), Databricks (PySpark, Scala), Stream Analytics, Event Hub, and HDInsight components
Experience working on data lake and DW solutions on Azure
Experience managing Azure DevOps pipelines (CI/CD)
Experience managing source data access security using Vault, configuring authentication and authorization, and enforcing data policies and standards
UG: B.Tech/B.E. in any specialization

Location and way of working
Base location: Pan India. This profile involves occasional travel to client locations. Hybrid is our default way of working; each domain has customized the hybrid approach to its unique needs.

Your role as a Consultant/Senior Consultant/Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and society. In addition to living our purpose, Consultants/Senior Consultants/Managers across our organization must strive to be:
Inspiring – Leading with integrity to build inclusion and motivation
Committed to creating purpose – Creating a sense of vision and purpose
Agile – Achieving high-quality results through collaboration and team unity
Skilled at building diverse capability – Developing diverse capabilities for the future
Persuasive/influencing – Persuading and influencing stakeholders
Collaborating – Partnering to build new solutions
Delivering value – Showing commercial acumen
Committed to expanding business – Leveraging new business opportunities
Analytical acumen – Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization
Effective communication – Holding well-structured and well-articulated conversations to achieve win-win possibilities
Engagement management/delivery excellence – Effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of the engagement
Managing change – Responding to a changing environment with resilience
Managing quality and risk – Delivering high-quality results and mitigating risks with utmost integrity and precision
Strategic thinking and problem solving – Applying a strategic mindset to solve business issues and complex problems
Tech savvy – Leveraging ethical technology practices to deliver high impact for clients and for Deloitte
Empathetic leadership and inclusivity – Creating a safe and thriving environment where everyone is valued for who they are, and using empathy to understand others and adapt our behaviours and attitudes to become more inclusive

How you’ll grow
Connect for impact – Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead – You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all – At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career – At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research, and know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
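The qualifications above center on Databricks (PySpark) pipelines landing data in an Azure data lake; a minimal sketch of that pattern under assumed conditions (the ADLS container paths and storage account are hypothetical, and `spark` is pre-provided in a Databricks notebook):

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks the session already exists; this is a no-op there.
spark = SparkSession.builder.getOrCreate()

# Read raw landed files from a hypothetical ADLS Gen2 container.
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://landing@mystorageaccount.dfs.core.windows.net/sales/")
)

# Basic cleansing: cast, filter out bad rows, stamp the ingest date.
cleaned = (
    raw
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .withColumn("ingest_date", F.current_date())
)

# Append to a Delta table in the curated zone, partitioned by ingest date.
(cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("abfss://curated@mystorageaccount.dfs.core.windows.net/sales/"))
```

In practice a job like this would be triggered from Azure Data Factory or an Azure DevOps pipeline, which is where the CI/CD experience the listing asks for comes in.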

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Overview

Working at Atlassian
Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, as part of being a distributed-first company.

Responsibilities
As a data engineer, you will have the opportunity to apply your strong technical experience building highly reliable data products. You enjoy working in an agile environment. You are able to translate raw requirements into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases.

On a typical day you will help our partner teams ingest data faster into our data lake, find ways to make our data products more efficient, or come up with ideas to help build self-serve data engineering within the company. Then you will move on to building microservices, and architecting, designing, and promoting self-serve capabilities at scale to help Atlassian grow.

Qualifications
On your first day, we'll expect you to have:
At least 3+ years of professional experience as a software engineer or data engineer
A BS in Computer Science or equivalent experience
Strong programming skills (some combination of Python, Java, and Scala)
Experience writing SQL, structuring data, and data storage practices
Experience with data modeling
Knowledge of data warehousing concepts
Experience building data pipelines and microservices
Experience with Spark, Airflow, and other streaming technologies to process incredible volumes of streaming data (a sketch of this pattern follows the listing)
A willingness to accept failure, learn, and try again
An open mind to try solutions that may seem impossible at first
Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS, and the like) and Databricks

It's preferred, but not technically required, that you have:
Experience building self-service tooling and platforms
Built and designed Kappa architecture platforms
A passion for building and running continuous integration pipelines
Built pipelines using Databricks and are well versed in their APIs
Contributed to open-source projects (e.g., operators in Airflow)

Our Perks & Benefits
Atlassian offers a variety of perks and benefits to support you, your family, and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
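As referenced in the qualifications above, a minimal Spark Structured Streaming sketch of streaming ingestion; the Kafka broker, topic, schema, and paths are hypothetical, and the Kafka connector package must be on the classpath:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import LongType, StringType, StructType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical JSON event schema.
schema = (StructType()
          .add("event_id", StringType())
          .add("user_id", StringType())
          .add("ts", LongType()))

# Read a continuous stream of events from a Kafka topic.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Continuously append parsed events to the lake, with checkpointing
# so the stream can recover exactly where it left off.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/events")
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```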

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site


Position: Senior Power BI Developer
Location: Chennai, Tamil Nadu
Experience Required: 3 to 5 years

About the Company:
Ignitho Inc. is a leading AI and data engineering company with a global presence, including offices in the US, UK, India, and Costa Rica. Visit our website to learn more about our work and culture: www.ignitho.com. Ignitho is a portfolio company of Nuivio Ventures Inc., a venture builder dedicated to developing Enterprise AI product companies across various domains, including AI, Data Engineering, and IoT. Learn more about Nuivio at: www.nuivio.com.

Job Overview:
We are seeking a highly skilled and experienced Senior Power BI Developer to join our dynamic and growing team. In this role, you will be responsible for designing, developing, and maintaining interactive reports and dashboards that empower business users to make informed, data-driven decisions. The ideal candidate will possess a deep understanding of Power BI, data modelling, and business intelligence best practices.

Key Responsibilities:
Design and develop robust Power BI reports and dashboards aligned with business objectives
Build complex semantic models, including composite models
Utilise DAX and Power Query to create high-performance BI solutions
Integrate data from multiple on-premises and cloud-based databases
Develop reports with advanced custom visuals and interactive elements
Implement data loading through XMLA endpoints
Design and manage Power Automate flows for process automation
Create and maintain paginated reports
Integrate advanced analytics tools (e.g., Python, R) within Power BI (see the sketch after this listing)
Apply strong SQL skills and ETL processes for data transformation and warehousing
Follow Agile methodologies for BI solution delivery
Manage deployment pipelines and version control
Administer Power BI environments, including workspace management, security, and content sharing
Leverage Power BI Embedded or the REST API for advanced integration and automation

Required Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent professional experience)
3 to 5 years of hands-on experience in developing Power BI reports and dashboards
Proven expertise in DAX, Power Query, and data modelling

Preferred Skills:
Experience with Databricks
Familiarity with Python, R, or other data analysis tools
Power BI certification is a strong advantage

Job Type: Full-time
Pay: ₹1,000,000.00 - ₹1,500,000.00 per year
Work Location: In person
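On the Python-in-Power-BI integration referenced above, a minimal sketch of a Python script visual. Inside Power BI, the selected fields are exposed as a pandas DataFrame named `dataset`; the column names here are hypothetical, and the stub at the top only exists so the script also runs standalone:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Power BI injects `dataset` at runtime; provide sample data otherwise.
try:
    dataset  # noqa: F821  (defined by Power BI when run as a script visual)
except NameError:
    dataset = pd.DataFrame({
        "month": ["Jan", "Feb", "Mar"],
        "revenue": [120, 150, 90],
    })

# Aggregate and plot; Power BI renders whatever matplotlib draws.
monthly = dataset.groupby("month", as_index=False)["revenue"].sum()

plt.figure(figsize=(8, 4))
plt.bar(monthly["month"], monthly["revenue"])
plt.title("Revenue by month")
plt.xlabel("Month")
plt.ylabel("Revenue")
plt.show()
```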

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About tsworks: tsworks is a leading technology innovator, providing transformative products and services designed for the digital-first world. Our mission is to provide domain expertise, innovative solutions and thought leadership to drive exceptional user and customer experiences. Demonstrating this commitment, we have a proven track record of championing digital transformation for industries such as Banking, Travel and Hospitality, and Retail (including e-commerce and omnichannel), as well as Distribution and Supply Chain, delivering impactful solutions that drive efficiency and growth. We take pride in fostering a workplace where your skills, ideas, and attitude shape meaningful customer engagements. About This Role: tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services Team. You will get hands-on experience with projects employing industry-leading technologies. This would initially be focused on the operational readiness and maintenance of existing applications and would transition into a build and maintenance role in the long run. Requirements Position: Data Engineer II Experience: 3 to 10+ Years Location: Bangalore, India Mandatory Required Qualification Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc. Expertise in DevOps and CI/CD implementation Good knowledge of SQL Excellent Communication Skills In This Role, You Will Design, implement, and manage scalable and efficient data architecture on the Azure cloud platform. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Perform complex data transformations and processing using Azure Data Factory, Azure Databricks, Snowflake's data processing capabilities, or other relevant tools. Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs. Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions. Integrate data from various sources, both internal and external, ensuring data quality and consistency. Ensure data models are designed for scalability, reusability, and flexibility. Implement data quality checks, validations, and monitoring processes to ensure data accuracy and integrity across Azure and Snowflake environments. Adhere to data governance standards and best practices to maintain data security and compliance. Handle performance optimization in the ADF and Snowflake platforms. Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable insights. Provide guidance and mentorship to junior team members to enhance their technical skills. Maintain comprehensive documentation for data pipelines, processes, and architecture within both Azure and Snowflake environments, including best practices, standards, and procedures. Skills & Knowledge Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3+ years of experience in Information Technology, designing, developing, and executing solutions. 3+ years of hands-on experience in designing and executing data solutions on Azure cloud platforms as a Data Engineer. Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc. Familiarity with the Snowflake data platform would be an added advantage.
Hands-on experience in data modelling and in building batch and real-time pipelines using Python, Java, or JavaScript, along with experience working with RESTful APIs, is required. Expertise in DevOps and CI/CD implementation. Hands-on experience with SQL and NoSQL databases. Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems. Experience with data modelling concepts and practices. Familiarity with data quality, governance, and security best practices. Knowledge of big data technologies such as Hadoop, Spark, or Kafka. Familiarity with machine learning concepts and integration of ML pipelines into data workflows. Hands-on experience working in an Agile setting. Is self-driven, naturally curious, and able to adapt to a fast-paced work environment. Can articulate, create, and maintain technical and non-technical documentation. Public cloud certifications are desired.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential. The Team We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions to drive data-driven decision-making across a variety of customers. Location: Bangalore/Mumbai/Pune/Delhi/Hyderabad/Coimbatore/Kolkata/Chennai Work you’ll do Responsibilities Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake. Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL) by following coding standards and best practices. Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions. Develop data models and schemas to support reporting and analytics needs. Ensure data quality, integrity, and security by implementing appropriate checks and controls. Monitor and optimize data processing performance, identifying, and resolving bottlenecks. Stay up to date with the latest advancements in data engineering and Databricks technologies. Qualifications Bachelor’s or master’s degree in any field 6-10 years of experience in designing, implementing, and maintaining data solutions on Databricks Experience with at least one of the popular cloud platforms – Azure, AWS or GCP Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes Knowledge of data warehousing and data modelling concepts Experience with Python or SQL Experience with Delta Lake Understanding of DevOps principles and practices Excellent problem-solving and troubleshooting skills Strong communication and teamwork skills How you will grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our purpose Deloitte is led by a purpose: To make an impact that matters . Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. 
Learn more about Deloitte's impact on the world Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
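
Editor's note for candidates preparing for roles like the one above: a minimal, illustrative PySpark batch ETL sketch of the ingest-transform-load pattern the posting describes. The paths and column names are hypothetical, and spark is assumed to be the pre-created Databricks session; this is a sketch, not the employer's code.

    from pyspark.sql import functions as F

    # ingest from a hypothetical raw zone of the data lake
    raw = spark.read.json("/mnt/raw/orders/")

    # transform: deduplicate, type the timestamp, drop bad records
    clean = (raw
             .dropDuplicates(["order_id"])
             .withColumn("order_ts", F.to_timestamp("order_ts"))
             .filter(F.col("amount") > 0))

    # load into a curated Delta table for reporting and analytics
    clean.write.format("delta").mode("overwrite").save("/mnt/curated/orders/")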

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

Role: Grafana BI Developer Location: Remote Duration: 3 Months Experience: 5+ yrs Time Zone: 12 PM - 9:00 PM IST Dual employment: The client does not allow dual employment; any existing dual employment must be terminated as soon as possible if applicable. Note: There will be a BGV process for this requirement. English fluency is required (all teams work internationally, and English is the standard language). The candidate should be an in-house bench resource. Technical Requirements The ideal candidate should have deep expertise in Grafana, with the ability to design, administer, optimize, and support real-time monitoring dashboards. Should be capable of analyzing existing business logic in Power BI and effectively migrating dashboards to Grafana. (Support will be available from our Power BI developers to understand current reports.) Must have experience in integrating Grafana with various data sources, particularly PostgreSQL, with potential future integration with platforms like Databricks. A strong understanding of both Grafana Cloud and on-premises deployments is required, as there is a possibility of moving to an on-premises setup in the future. Initial contract duration: approximately 3 months, focused on a single real-time data streaming use case. Skills: communication, problem solving, SQL proficiency, data warehousing, analytical skills, APIs, Agile methodologies, reporting tools, Python, JavaScript, data visualization, team collaboration, data integration, Grafana

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the AWS Cloud Data Platform Responsibilities Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework with Python or Scala and Big Data technologies for various use cases built on the platform Experience in developing streaming pipelines Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data / cloud technologies such as Apache Spark, Kafka, and cloud computing platforms Preferred Education Master's Degree Required Technical And Professional Expertise 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark / Python or Scala; Minimum 3 years of experience on Cloud Data Platforms on AWS; Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB Good to excellent SQL skills Exposure to streaming solutions and message brokers like Kafka Preferred Technical And Professional Experience Certification in AWS and Databricks, or Cloudera Certified Spark Developer
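
Editor's note: interviews for streaming-heavy roles like this often probe Structured Streaming basics, so here is a minimal, illustrative PySpark sketch of a Kafka-to-Delta pipeline. The broker address, topic name, schema, and paths are hypothetical, and spark is assumed to be the pre-created session; treat it as a sketch, not a production recipe.

    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    # hypothetical event schema for messages arriving on a Kafka topic
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
           .option("subscribe", "orders")                      # hypothetical topic
           .load())

    # Kafka delivers bytes; cast the value and parse the JSON payload
    parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(F.from_json("json", schema).alias("e"))
              .select("e.*"))

    # the checkpoint location lets the stream recover with exactly-once sinks
    query = (parsed.writeStream
             .format("delta")
             .option("checkpointLocation", "/tmp/chk/orders")
             .start("/tmp/delta/orders"))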

Posted 1 week ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Software Engineer (Backend) (SDE-1) DViO is one of the largest independent, highly awarded, digital-first marketing companies with a team of 175+ people operating across India, the Middle East, and South East Asia. We are a full-service digital marketing agency with a focus on ROI-driven marketing. We are looking for a Software Engineer (Backend) to join our team. The ideal candidate will have a strong background in software development and experience with backend technologies. We are looking for someone who is passionate about backend system design and is looking to grow in this field. Responsibilities You will be working with a team that will be responsible for developing services for various applications, like marketing automation, campaign optimization, recommendation & analytical systems, etc. The candidate will work on developing backend services, including REST APIs, data processing pipelines, and database management. Develop backend services for various business use cases Write clean, maintainable code Collaborate with other team members Improve code based on feedback Work on bug fixes, refactoring and performance improvements Track technology changes and keep our applications up to date Requirements Qualifications: Bachelor's degree in Computer Science, Engineering, or related field 0-1 year of experience in software development Must-have skills: Proficient in either PHP, Python, or Node.js Experience with any backend MVC frameworks like Laravel, Rails, Express, Django etc. Experience with any database like MySQL, PostgreSQL, MongoDB, etc. Experience with REST APIs, Docker, Bash and Git Good-to-have skills: Experience with WebSockets, Socket.io, etc. Experience with search technologies like Meilisearch, Typesense, Elasticsearch, etc. Experience with caching technologies like Redis, Memcached, etc. Experience with cloud platforms like AWS, GCP, Azure, etc. Experience with monolithic architecture Experience with data warehouses or data lakes like Snowflake, Amazon Redshift, Google BigQuery, Databricks, etc. Benefits DViO offers an innovative and challenging work environment with the opportunity to work on cutting-edge technologies. Join us and be a part of a dynamic team that is passionate about software development, and build applications that will shape the future of digital marketing.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Position Overview: ShyftLabs is seeking an experienced Databricks Architect to lead the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires deep expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to architect scalable, high-performance data platforms and drive data-driven innovation. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to accelerate business growth across various industries by focusing on creating value through innovation. Job Responsibilities Architect, design, and optimize big data and AI/ML solutions on the Databricks platform. Develop and implement highly scalable ETL pipelines for processing large datasets. Lead the adoption of Apache Spark for distributed data processing and real-time analytics. Define and enforce data governance, security policies, and compliance standards. Optimize data lakehouse architectures for performance, scalability, and cost-efficiency. Collaborate with data scientists, analysts, and engineers to enable AI/ML-driven insights. Oversee and troubleshoot Databricks clusters, jobs, and performance bottlenecks. Automate data workflows using CI/CD pipelines and infrastructure-as-code practices. Ensure data integrity, quality, and reliability across all data processes. Basic Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. 10+ years of hands-on experience in data engineering, with at least 5 years in Databricks architecture and Apache Spark. Proficiency in SQL, Python, or Scala for data processing and analytics. Extensive experience with cloud platforms (AWS, Azure, or GCP) for data engineering. Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture. Hands-on experience with CI/CD tools and DevOps best practices. Familiarity with data security, compliance, and governance best practices. Strong problem-solving and analytical skills in a fast-paced environment. Preferred Qualifications: Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer). Hands-on experience with MLflow, Feature Store, or Databricks SQL. Exposure to Kubernetes, Docker, and Terraform. Experience with streaming data architectures (Kafka, Kinesis, etc.). Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker). Prior experience working with retail, e-commerce, or ad-tech data platforms. We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
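
Editor's note: as a taste of the lakehouse optimization work such a role involves, a minimal Databricks-specific Delta maintenance sketch. The table name is hypothetical, and OPTIMIZE/ZORDER and VACUUM assume a Databricks runtime where these Delta Lake commands are available.

    # compact small files and co-locate rows for selective customer queries
    spark.sql("OPTIMIZE sales_orders ZORDER BY (customer_id)")

    # remove data files no longer referenced by the table (default retention applies)
    spark.sql("VACUUM sales_orders")

    # inspect the table's transaction history after maintenance
    spark.sql("DESCRIBE HISTORY sales_orders").show(truncate=False)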

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description Responsibilities: ● Work with multiple Agile teams delivering data and analytics solutions. ● Serve as Scrum Master for teams supporting a global manufacturing enterprise. ● Collaborate with Product Owners to manage and refine backlogs aligned to business needs. ● Facilitate Agile ceremonies: Sprint Planning, Stand-ups, Reviews, Retrospectives, etc. ● Drive data-focused sprint delivery: ingestion, transformation, integration, and reporting. ● Identify and resolve blockers; champion continuous improvement and delivery velocity. ● Partner with cross-functional stakeholders: data engineers, analysts, and architects. ● Promote Agile practices across platforms like SAP ECC, IBP, HANA, BOBJ, Databricks, Tableau. ● Track Agile metrics (velocity, burndown, throughput) to improve team performance. ● Support capacity planning, sprint forecasting, and risk identification. ● Foster a high-performance culture built on adaptability, collaboration, and customer focus. ● Orient the team toward outcome-based progress: “building outcomes” vs. “completing tasks”. ● Help break down efforts into small, incremental work units for better delivery flow. ● Ensure story clarity with detailed descriptions and acceptance criteria. ● Lead daily stand-ups with a focus on “completion” over “in progress”. Must-Have Skills: ● 3–5 years of experience as a Scrum Master in Data & Analytics environments. ● Experience working with SAP, HANA, and related analytics tools/platforms. ● Strong knowledge of Agile principles beyond just the ceremonies. ● Ability to guide teams in behavior and mindset change, not just process compliance. ● Skilled in tracking sprint metrics and helping set achievable sprint goals. ● Strong organizational, interpersonal, analytical, and communication skills. ● Comfortable working with global teams and flexible across time zones.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Linkedin logo

Job Title: Data Scientist Location: Navi Mumbai Experience: 3-6 Years Duration: Full-time Job Summary: We are looking for a highly skilled Data Scientist with deep expertise in time series forecasting, particularly in demand forecasting and customer lifecycle analytics (CLV). The ideal candidate will be proficient in Python or PySpark, have hands-on experience with tools like Prophet and ARIMA, and be comfortable working in Databricks environments. Familiarity with classic ML models and optimization techniques is a plus. Key Responsibilities • Develop, deploy, and maintain time series forecasting models (Prophet, ARIMA, etc.) for demand forecasting and customer behavior modeling. • Design and implement Customer Lifetime Value (CLV) models to drive customer retention and engagement strategies. • Process and analyze large datasets using PySpark or Python (Pandas). • Partner with cross-functional teams to identify business needs and translate them into data science solutions. • Leverage classic ML techniques (classification, regression) and boosting algorithms (e.g., XGBoost, LightGBM) to support broader analytics use cases. • Use Databricks for collaborative development, data pipelines, and model orchestration. • Apply optimization techniques where relevant to improve forecast accuracy and business decision-making. • Present actionable insights and communicate model results effectively to technical and non-technical stakeholders. Required Qualifications • Strong experience in Time Series Forecasting, with hands-on knowledge of Prophet, ARIMA, or equivalent – Mandatory. • Proven track record in Demand Forecasting – Highly Preferred. • Experience in modeling Customer Lifecycle Value (CLV) or similar customer analytics use cases – Highly Preferred. • Proficiency in Python (Pandas) or PySpark – Mandatory. • Experience with Databricks – Mandatory. • Solid foundation in statistics, predictive modeling, and machine learning.
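
Editor's note: a minimal sketch of the Prophet workflow this posting names, for candidates brushing up. It assumes a hypothetical daily demand history file with Prophet's expected ds (date) and y (value) columns; it is illustrative, not the employer's code.

    import pandas as pd
    from prophet import Prophet

    # hypothetical daily demand history; Prophet expects columns ds and y
    df = pd.read_csv("daily_demand.csv", parse_dates=["ds"])

    model = Prophet(weekly_seasonality=True, yearly_seasonality=True)
    model.fit(df)

    # forecast 90 days beyond the observed history, with uncertainty bounds
    future = model.make_future_dataframe(periods=90)
    forecast = model.predict(future)
    print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())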

Posted 1 week ago

Apply

6.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Senior Data Engineer Experience: 6+ Years Location: Remote Employment Type: Full Time Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic data engineering team. The ideal candidate will have deep expertise in C#, Azure Data Factory (ADF), Databricks, SQL Server, and Python, along with a strong understanding of modern CI/CD practices. You will be responsible for designing, developing, and maintaining scalable and efficient data pipelines and solutions to support analytics, reporting, and operational systems. Key Responsibilities: Design, develop, and optimize complex data pipelines using Azure Data Factory, Databricks, and SQL Server.

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Advisor will work closely in a consultative capacity with senior CoE management & global sales leadership on multiple projects related to customer & pricing analytics. Advisor is expected to provide industry recognized thought leadership at the enterprise level related to data analysis & visualization, creating the logic for and implementing strategies, providing requirements to data analysts and technology teams on data attributes, models and platform requirements, and communicating with global stakeholders to ensure we deliver the best possible customer experience. Provides regular expert consultative advice to senior leadership and champions the design and development of innovative solutions. Should possess and demonstrate understanding of core Business and Commercial concepts including financial metrics, market dynamics, and competitive landscapes. Communicates results to a broad range of audiences. Effectively uses current and emerging technologies to evaluate trends and develop actionable insights and recommendations to management, via understanding of the business model and the information available for analysis. Grade: 11 "Please note that the Job will close at 12am on Posting Close date, so please submit your application prior to the Close Date" What Your Main Responsibilities Are The Key responsibilities of this role are: Leads & guides teams that leverage the most advanced descriptive & diagnostic techniques and/or other approaches in the analysis of complex business situations. Work cross-functionally with teams to analyze usage and uncover key, actionable insights Champions, develops and implements innovative solutions from initial concept to fully tested production, and communicates results to a broad range of audiences Expert use, investigation and implementation of the most current and emerging technologies to evaluate trends and develop actionable insights and recommendations to management that will enable process transformation Designing and measuring controlled experiments to determine the potential impact of new approaches. Help with various data analysis and modelling projects Place actionable data points and trends in context for leadership to understand actual performance and uncover opportunities. Take ownership of the end-to-end system from Problem statement to Solution Delivery and leverage other teams if required Mentors less senior staff. Lead cross-functional projects and programs, formally preparing and presenting to management. Routinely work on multiple highly complex assignments concurrently. Provides expert consultation to Sr. Leadership routinely and present insights with strong storytelling skills What We Are Looking For Key skills needed for this role: Skills Strong financial acumen particularly of pricing models/systems, revenue & cost structures, contribution & operating margins, and P&L views Excellent stakeholder management skills particularly with team members across different regions to achieve common goals Strong communication skills to communicate with people across all levels including senior management & be able to tell logical stories by crafting solid, visually appealing presentations. Excellent project management skills Strong analytical skills to deliver accurate results & actionable recommendations.
Key behaviors & mindsets: Consultative mindset Innovation mindset Bias for action with a focus on transformation of legacy processes Sense of ownership Qualification Master’s degree in business, information systems, computer science, or a quantitative discipline from tier 1/2 institutes. Experience requirement: 7-10 years of relevant analytics/consulting/leadership experience Tools/platforms: Oracle, SQL, Teradata, R, Python, Power BI, AbInitio, SAS, Azure Databricks FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Linkedin logo

We’re Hiring: MLOps Engineer (Azure). Contact: harshita.panchariya@tecblic.com Location: Ahmedabad, Gujarat Experience: 3–5 Years Employment Type: Full-Time An immediate joiner will be preferred. Job Summary: We are seeking a skilled and proactive MLOps/DataOps Engineer with strong experience in the Azure ecosystem to join our team. You will be responsible for streamlining and automating machine learning and data pipelines, supporting scalable deployment of AI/ML models, and ensuring robust monitoring, governance, and CI/CD practices across the data and ML lifecycle. Key Responsibilities MLOps: Design and implement CI/CD pipelines for machine learning workflows using Azure DevOps, GitHub Actions, or Jenkins. Automate model training, validation, deployment, and monitoring using tools such as Azure ML, MLflow, or Kubeflow. Manage model versioning, performance tracking, and rollback strategies. Integrate machine learning models with APIs or web services using Azure Functions, Azure Kubernetes Service (AKS), or Azure App Services. DataOps Design, build, and maintain scalable data ingestion, transformation, and orchestration pipelines using Azure Data Factory, Synapse Pipelines, or Apache Airflow. Ensure data quality, lineage, and governance using Azure Purview or other metadata management tools. Monitor and optimize data workflows for performance and cost efficiency. Support batch and real-time data processing using Azure Stream Analytics, Event Hubs, Databricks, or Kafka. DevOps & Infrastructure Provision and manage infrastructure using Infrastructure-as-Code tools such as Terraform, ARM Templates, or Bicep. Set up and manage compute environments (VMs, AKS, AML Compute), storage (Blob, Data Lake Gen2), and networking in Azure. Implement observability using Azure Monitor, Log Analytics, and Application Insights. Skills: Strong hands-on experience with Azure Machine Learning, Azure Data Factory, Azure DevOps, and Azure Storage solutions. Proficiency in Python, Bash, and scripting for automation. Experience with Docker, Kubernetes, and containerized deployments in Azure. Good understanding of CI/CD principles, testing strategies, and ML lifecycle management. Familiarity with monitoring, logging, and alerting in cloud environments. Knowledge of data modeling, data warehousing, and SQL. Preferred Qualifications Azure Certifications (e.g., Azure Data Engineer Associate, Azure AI Engineer Associate, or Azure DevOps Engineer Expert). Experience with Databricks, Delta Lake, or Apache Spark on Azure. Exposure to security best practices in ML and data environments (e.g., identity management, network security). Soft Skills Strong problem-solving and communication skills. Ability to work independently and collaboratively with data scientists, ML engineers, and platform teams. Passion for automation, optimization, and driving operational excellence. Contact: harshita.panchariya@tecblic.com
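
Editor's note: a minimal sketch of the MLflow tracking pattern this role automates, using a toy scikit-learn model. The experiment name, dataset, and hyperparameter are hypothetical; this illustrates the API shape, not the employer's pipeline.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # toy data standing in for a real training set
    X, y = make_classification(n_samples=500, random_state=42)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

    mlflow.set_experiment("demo-churn-model")  # hypothetical experiment name
    with mlflow.start_run():
        model = LogisticRegression(max_iter=200).fit(X_tr, y_tr)
        acc = accuracy_score(y_te, model.predict(X_te))
        mlflow.log_param("max_iter", 200)          # record the run's config
        mlflow.log_metric("accuracy", acc)         # record the run's result
        mlflow.sklearn.log_model(model, "model")   # versioned artifact for later deployment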

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Advisor will work closely in a consultative capacity with senior CoE management & global sales leadership on multiple projects related to customer & pricing analytics. Advisor is expected to provide industry recognized thought leadership at the enterprise level related to data analysis & visualization, creating the logic for and implementing strategies, providing requirements to data analysts and technology teams on data attributes, models and platform requirements, and communicating with global stakeholders to ensure we deliver the best possible customer experience. Provides regular expert consultative advice to senior leadership and champions the design and development of innovative solutions. Should possess and demonstrate understanding of core Business and Commercial concepts including financial metrics, market dynamics, and competitive landscapes. Communicates results to a broad range of audiences. Effectively uses current and emerging technologies to evaluate trends and develop actionable insights and recommendations to management, via understanding of the business model and the information available for analysis. Grade: 11 "Please note that the Job will close at 12am on Posting Close date, so please submit your application prior to the Close Date" What Your Main Responsibilities Are The Key responsibilities of this role are: Leads & guides teams that leverage the most advanced descriptive & diagnostic techniques and/or other approaches in the analysis of complex business situations. Work cross-functionally with teams to analyze usage and uncover key, actionable insights Champions, develops and implements innovative solutions from initial concept to fully tested production, and communicates results to a broad range of audiences Expert use, investigation and implementation of the most current and emerging technologies to evaluate trends and develop actionable insights and recommendations to management that will enable process transformation Designing and measuring controlled experiments to determine the potential impact of new approaches. Help with various data analysis and modelling projects Place actionable data points and trends in context for leadership to understand actual performance and uncover opportunities. Take ownership of the end-to-end system from Problem statement to Solution Delivery and leverage other teams if required Mentors less senior staff. Lead cross-functional projects and programs, formally preparing and presenting to management. Routinely work on multiple highly complex assignments concurrently. Provides expert consultation to Sr. Leadership routinely and present insights with strong storytelling skills What We Are Looking For Key skills needed for this role: Skills Strong financial acumen particularly of pricing models/systems, revenue & cost structures, contribution & operating margins, and P&L views Excellent stakeholder management skills particularly with team members across different regions to achieve common goals Strong communication skills to communicate with people across all levels including senior management & be able to tell logical stories by crafting solid, visually appealing presentations. Excellent project management skills Strong analytical skills to deliver accurate results & actionable recommendations.
Key behaviors & mindsets: Consultative mindset Innovation mindset Bias for action with a focus on transformation of legacy processes Sense of ownership Qualification Master’s degree in business, information systems, computer science, or a quantitative discipline from tier 1/2 institutes. Experience requirement: 7-10 years of relevant analytics/consulting/leadership experience Tools/platforms: Oracle, SQL, Teradata, R, Python, Power BI, AbInitio, SAS, Azure Databricks FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Skill required: Tech for Operations - Automation Anywhere Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation, BE Years of Experience: 5 - 8 Years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song—all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com What would you do? The RPA Lead Developer will be responsible for the design & development of end-to-end RPA automation leveraging A360 tools & technologies. Should anticipate, identify, track, and resolve technical issues and risks affecting delivery. Understand the Automation Anywhere RPA platform, its features, capabilities, and best practices. You would need to be proficient in designing and implementing automation workflows that optimize business processes. What are we looking for? Minimum 5–8 years of strong software design & development experience Minimum 5–6 years of programming experience in Automation Anywhere A360, Document Automation, Co-pilot, and Python. Effective Gen AI prompt creation for data extraction using Gen AI OCR Experience with APIs, data integration, and automation best practices Experience in VBA, VB, and Python script programming Good knowledge of Gen AI and machine learning. Strong hands-on knowledge of core .NET concepts and OOP programming. Understands OO concepts and consistently applies them in client engagements. Hands-on experience with SQL & T-SQL queries and creating complex stored procedures. Exceptional presentation, written and verbal communication skills (English) Good understanding of workflow-based logic and hands-on experience using process templates, VBO design and build. Should understand process analysis and pipeline build for automation processes. Automation Anywhere A360 Master/Advanced certification. Strong programming knowledge of HTML and JavaScript / VB scripts Experience with Agile development methodology. Exposure to SAP automation is preferred. Exposure to A360 Control Room features. Azure Machine Learning, Azure Databricks, and other Azure AI services. Exposure to GDPR compliance is preferred. Agile development methodologies are an added advantage. Roles and Responsibilities: Lead the team to develop automation bots and processes using the A360 platform. Utilize A360’s advanced features (AARI, WLM, API consumption, Document Automation, Co-pilot) to automate complex tasks, streamline processes, and optimize efficiency. Integrate A360 with various APIs, databases, and third-party tools to ensure seamless data flow and interaction between systems. Should be able to identify and build the common components to be used across projects. Collaborate with cross-functional teams including business analysts and Process Architects to deliver holistic automation solutions that cater to various stakeholder needs. Strong SQL database management and troubleshooting skills. Serve as a technical expert on development projects. Review code for compliance and reuse.
Ensure code complies with RPA architectural industry standards. Lead the problem identification / error resolution process, including tracking, repairing, and reporting defects. Create and maintain documentation to support role responsibilities for training, cross-training, and disaster recovery. Monitor and maintain license utilization and subscriptions. Maintain / monitor RPA environments (Dev/Test/Prod). Review and ensure automation runbooks are complete and maintained. Design, develop, document, test, and debug new robotic process automation (RPA) applications for internal use. Any Graduation, BE

Posted 1 week ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  • Junior Developer
  • Senior Developer
  • Tech Lead
  • Architect

Related Skills

In addition to Databricks expertise, the following skills are often expected or helpful:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks. (medium; see the first sketch after this list)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium; see the Delta Lake sketch after this list)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium; see the Delta Lake sketch after this list)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
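
For the lazy evaluation question above, a minimal PySpark sketch: transformations only build a logical plan, and nothing executes until an action runs. On Databricks the spark session is pre-created; the builder line below is only for running the sketch elsewhere.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("lazy-eval-demo").getOrCreate()

    df = spark.range(1_000_000)                            # no job runs yet
    evens = df.filter(F.col("id") % 2 == 0)                # transformation: still lazy
    doubled = evens.withColumn("twice", F.col("id") * 2)   # transformation: still lazy
    print(doubled.count())                                 # action: the optimized plan executes here

Because nothing executes until the action, Spark can optimize the whole chain (for example, combining the filter and the projection) before reading any data.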
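
For the schema enforcement and schema evolution questions above, a minimal Delta Lake sketch. It assumes a Databricks runtime where Delta is available and uses a hypothetical table path and toy rows.

    # seed a Delta table with an initial two-column schema
    base = spark.createDataFrame([(1, "alice")], ["id", "name"])
    base.write.format("delta").mode("overwrite").save("/tmp/demo/users")

    # new data arrives with an extra column
    extended = spark.createDataFrame([(2, "bob", "IN")], ["id", "name", "country"])

    # Without mergeSchema, Delta's schema enforcement rejects the extra column;
    # with it, the column is added and existing rows read back NULL for country.
    (extended.write.format("delta")
     .mode("append")
     .option("mergeSchema", "true")
     .save("/tmp/demo/users"))

    spark.read.format("delta").load("/tmp/demo/users").show()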

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
