0 years
0 Lacs
Calcutta
On-site
Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Cloud Data Architecture
Minimum 5 years of experience is required
Educational Qualification: BE or MCA

Summary: As a Technology Architect, you will be responsible for reviewing and integrating all application requirements, including functional, security, integration, performance, quality, and operations requirements. Your typical day will involve reviewing and integrating technical architecture requirements, providing input into final decisions regarding hardware, network products, system software, and security, and using the Databricks Unified Data Analytics Platform to deliver impactful data-driven solutions.

Roles & Responsibilities:
- 6 or more years of experience implementing data ingestion pipelines from multiple sources and creating end-to-end data pipelines on the Databricks platform.
- 2 or more years of experience using Python, PySpark, or Scala.
- Experience with Databricks in the cloud.
- Experience with any of AWS, Azure, or GCP; ETL; data engineering; and data cleansing and insertion into a data warehouse.
- Must-have skills: Databricks, Cloud Data Architecture, Python, Data Engineering.

Professional Attributes:
- Excellent writing, communication, and presentation skills.
- Eagerness to learn and develop on an ongoing basis.
- Excellent client-facing and interpersonal skills.

Educational Qualification: BE or MCA
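The ingestion work described above (pulling records from multiple sources, cleansing them, and landing them in a warehouse) would normally be written in PySpark on Databricks; as a language-neutral sketch, the cleanse-and-deduplicate step can be illustrated in plain Python. All field names (`customer_id`, `txn_date`, `amount`) are hypothetical, not from the posting:

```python
def cleanse_and_merge(sources):
    """Merge raw records from multiple source systems, drop incomplete
    rows, and deduplicate before loading to a warehouse staging table.

    sources: dict mapping source name -> list of raw record dicts.
    """
    seen, rows = set(), []
    for source_name, records in sources.items():
        for rec in records:
            cust_id = str(rec.get("customer_id", "")).strip()
            amount = rec.get("amount")
            if not cust_id or amount is None:
                continue  # data cleansing: skip incomplete rows
            key = (cust_id, rec.get("txn_date"))
            if key in seen:
                continue  # deduplicate records seen across sources
            seen.add(key)
            rows.append({"customer_id": cust_id,
                         "txn_date": rec.get("txn_date"),
                         "amount": float(amount),
                         "source": source_name})
    return rows

# Toy run: one duplicate across sources, one incomplete row.
sources = {
    "crm": [{"customer_id": "A1", "txn_date": "2024-01-01", "amount": 10},
            {"customer_id": "", "txn_date": "2024-01-02", "amount": 5}],
    "erp": [{"customer_id": "A1", "txn_date": "2024-01-01", "amount": 10},
            {"customer_id": "B2", "txn_date": "2024-01-03", "amount": "7.5"}],
}
print(cleanse_and_merge(sources))
```

On Databricks the same shape would be a DataFrame pipeline (`spark.read` per source, filter, `dropDuplicates`, write to Delta), but the cleanse/dedupe logic is the same.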
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Rate indications

EXL (NASDAQ: EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices, and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, and transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), Latin America, Australia, and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques, and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods; advanced marketing, pricing, and CRM strategies; internal cost analysis; and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, and transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.
Role & Responsibilities Overview
- Collaborate with the rate-filing team to analyze and estimate reserves for our P&C insurance products by state, including performing triangle-based loss reserve reviews and analyses.
- Analyze data, perform actuarial calculations, and generate state-specific filing indications for DOI submissions (Auto & Home).
- Support monthly/quarterly rate updates and processes (rate files, RPC, forecasts).
- Assist in the development and enhancement of rate-filing tools, models, and processes to improve accuracy and efficiency.
- Provide support in the preparation of financial reports, including reserve-related disclosures.
- Stay updated on best practices in actuarial methodologies and techniques.
- Mentor and provide guidance to junior team members as needed.

Candidate Profile
- Bachelor's/Master's degree in engineering, economics, mathematics, actuarial sciences, or statistics.
- Affiliation with IAI or IFoA, with 2-6 CT actuarial exams, will be an added advantage.
- 2-6 years of actuarial experience in the P&C insurance industry.
- Good knowledge of insurance terms.
- Advanced skills in Excel, Python, SQL, and other relevant tools for data analysis and modeling.
- Experience with Databricks is good to have.
- Excellent analytical and problem-solving skills, with the ability to analyze complex data and make data-driven decisions.
- Strong communication skills, including the ability to effectively communicate actuarial concepts to both technical and non-technical stakeholders.
- Ability to work independently and collaboratively in a team-oriented environment.
- Detail-oriented, with strong organizational and time management skills.
- Ability to adapt to changing priorities and deadlines in a fast-paced environment.

What We Offer
EXL Analytics offers an exciting, fast-paced, and innovative environment that brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions.
From your very first day, you get the opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses that our clients engage in. You will also learn effective teamwork and time-management skills, key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.
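The triangle-based loss reserve reviews this role performs are classically done with the chain-ladder method: compute age-to-age development factors from a cumulative loss triangle, then project each accident year to ultimate. A minimal sketch with made-up numbers (not from the posting):

```python
def chain_ladder_ultimates(triangle):
    """Project ultimate losses from a cumulative loss development triangle.

    triangle: dict mapping accident year -> list of cumulative losses by
    development age (recent years have shorter rows).
    Returns dict of accident year -> projected ultimate loss.
    """
    years = sorted(triangle)
    n = max(len(v) for v in triangle.values())
    # Volume-weighted age-to-age (link) factors across accident years.
    factors = []
    for age in range(n - 1):
        num = sum(triangle[y][age + 1] for y in years if len(triangle[y]) > age + 1)
        den = sum(triangle[y][age] for y in years if len(triangle[y]) > age + 1)
        factors.append(num / den)
    # Roll each accident year's latest diagonal forward to ultimate.
    ultimates = {}
    for y in years:
        value = triangle[y][-1]
        for age in range(len(triangle[y]) - 1, n - 1):
            value *= factors[age]
        ultimates[y] = value
    return ultimates

# Toy cumulative triangle: rows are accident years, columns development ages.
triangle = {2021: [100, 150, 165], 2022: [110, 160], 2023: [120]}
print(chain_ladder_ultimates(triangle))
```

Here the 12-to-24-month factor is (150+160)/(100+110) ≈ 1.476 and the 24-to-36 factor is 165/150 = 1.1, so 2022 projects to 160 × 1.1 = 176.0. Real reviews would add tail factors and alternative methods (Bornhuetter-Ferguson, etc.).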
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Andhra Pradesh
On-site
Business Analytics Lead Analyst - HIH - Evernorth

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Business Analytics Lead Analyst (Dashboarding)
The job profile for this position is Business Analytics Lead Analyst. The Customer Experience & Operations Enablement Analytics organization offers solutions that provide data, reporting, and actionable insights to internal and external business partners to improve customer experience, reduce cost, measure business performance, and inform business decisions. The Business Analytics Lead Analyst will be responsible for dashboard and report creation, as well as pulling data to meet ad hoc measurement needs. The individual will create prototypes of reporting needs and support manual report/scorecard creation where automated dashboards are not feasible. The analyst will be comfortable working directly with the Operations teams to learn about their processes and where the data and reporting fit in. We are looking for candidates who can work directly with operations team members to understand requirements and do their own development and testing.

Responsibilities include:
- Using SQL to write queries to answer questions and perform ETL tasks to create datasets.
- Using Tableau or similar data visualization tools to automate scorecards and reports.
- Using business intelligence tools to create self-service reporting for business partners.
- Conducting self-driven data exploration and documentation of tables, schemas, and tests.
- Using SQL to query data structures to help inform our business partners.
- Examining and interpreting data to discover weaknesses and identify root causes.
- Completing ad hoc requests for business partners' data needs.
- Identifying and implementing automation to consolidate similar or repeated ad hoc requests.
- Understanding business needs to better inform reporting and analytics duties.
- Giving guidance on any recurring problems or issues.
- Completing proposals in cooperation with subject-matter experts (SMEs).
- Refactoring reporting to enhance performance, provide deeper insight, and answer questions.
- Updating project documents and status reports.

Qualifications:
- 5-8 years of relevant analytics experience, with proficiency in Structured Query Language (SQL) and Oracle.
- Experience with business intelligence software (Tableau, Power BI, Looker, etc.).
- 3-5 years of experience with a scripting language (Python, PowerShell, VBA) and big data platforms (Databricks, Hadoop, AWS).
- Excellent verbal, written, and interpersonal communication skills.
- Problem-solving, consulting, teamwork, leadership, and creativity skills.
- An analytical mind with an outstanding ability to collect and analyze data.
- Expertise in contact center or workforce planning operations preferred.
- Proficiency in Agile practices (Jira) preferred.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
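The core loop of this role, using SQL to build a dataset and roll it up into a scorecard metric, can be sketched end to end with Python's built-in sqlite3. The table and metric (a contact-center resolution rate) are invented for illustration, not taken from the posting:

```python
import sqlite3

# Build a tiny in-memory dataset, then compute a scorecard with one query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (agent TEXT, handled INTEGER, resolved INTEGER)")
conn.executemany("INSERT INTO calls VALUES (?, ?, ?)",
                 [("ana", 40, 36), ("ben", 50, 40), ("ana", 10, 9)])

# Per-agent scorecard: total calls handled and resolution rate.
rows = conn.execute("""
    SELECT agent,
           SUM(handled) AS handled,
           ROUND(1.0 * SUM(resolved) / SUM(handled), 2) AS resolution_rate
    FROM calls
    GROUP BY agent
    ORDER BY agent
""").fetchall()
print(rows)
```

In practice the query would run against Oracle or Databricks and feed a Tableau extract, but the aggregation pattern (GROUP BY plus a ratio metric, guarding against integer division with `1.0 *`) is the same.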
Posted 1 week ago
15.0 years
0 Lacs
India
Remote
Company Description
Simbus Technologies empowers businesses through results-driven IT consulting services with a strong focus on the Kinaxis Maestro and Databricks platforms. With 15 years of expertise in supply chain planning, we deliver solutions that enhance agility, boost efficiency, and enable smarter decision-making. Our strategic blend of supply chain and data capabilities, from Kinaxis Maestro implementations to advanced data engineering and AI/ML integration with Databricks, ensures seamless digital transformation. We partner with clients to innovate, streamline, and grow their operations.

Role Description
This is a part-time remote role for a Databricks Technical Advisor. The role involves providing technical guidance and support for scaling up our Databricks COE. You will be responsible for designing the recruitment process and technical training, and for assisting Simbus on select client engagements in the area of solution architecture. The role entails an investment of approximately 10 hours per week and carries attractive compensation. The role can be converted to a full-time role by mutual agreement.

Qualifications
- Strong Databricks skills combined with excellent analytical skills for data analysis and problem-solving
- Excellent communication and coaching skills to effectively interact with COE teammates
- Experience with cloud data engineering and migration is beneficial
- Understanding of real-time analytics and business intelligence
- Knowledge of machine learning and AI integration
- Bachelor's degree in Computer Science, Information Technology, or a related field
Posted 1 week ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
- The ability to be a team player
- The ability and skill to train other people in procedural and technical topics
- Strong communication and collaboration skills

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- Able to write complex SQL queries
- Experience with Azure Databricks

Preferred Technical and Professional Experience
- Excellent communication and stakeholder management skills
Posted 1 week ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Objectives and Purpose
The Senior Data Engineer ingests, builds, and supports large-scale data architectures that serve multiple downstream systems and business users. This individual supports the Data Engineer Leads and partners with Visualization on data quality and troubleshooting needs.

The Senior Data Engineer will:
- Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses.
- Support development, testing, and maintenance of data pipelines and platforms, enabling data quality to be utilized within business dashboards and tools.
- Create, maintain, and support the data platform and infrastructure that enables the analytics front end, including the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes.

Your Key Responsibilities

Data Engineering
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations using AWS-native technologies to support continuing increases in data sources, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve productivity.
- Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
- Mentor and coach staff data engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
- Support standardization, customization, and ad hoc data analysis, and develop the mechanisms to ingest, analyze, validate, normalize, and clean data.
- Write unit/integration/performance test scripts and perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Learn about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics.
- Solve complex data problems to deliver insights that help achieve business objectives.
- Implement statistical data quality procedures on new data sources by applying rigorous, iterative data analytics.

Relationship Building and Collaboration
- Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Support Data Scientists in data sourcing and preparation to visualize data and synthesize insights of commercial value.
- Collaborate with AI/ML engineers to create data products for analytics and data scientist team members to improve productivity.
- Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices, promoting the values of learning and growth.

Skills and Attributes for Success

Technical/Functional Expertise
- Advanced experience with and understanding of data/big data, data integration, data modelling, AWS, and cloud technologies.
- Strong business acumen; knowledge of the pharmaceutical, healthcare, or life sciences sector is preferred but not required.
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata.
- Ability to build and optimize queries (SQL), data sets, big data pipelines, and architectures for structured and unstructured data.
- Experience with or knowledge of Agile software development methodologies.

Leadership
- Strategic mindset: thinking above the minor, tactical details and focusing on the long-term, strategic goals of the organization.
- Advocate of a culture of collaboration and psychological safety.

Decision-making and Autonomy
- Shift from manual decision-making to data-driven, strategic decision-making.
- Proven track record of applying critical thinking to resolve issues and overcome obstacles.

Interaction
- Proven track record of collaboration and developing strong working relationships with key stakeholders by building trust and being a true business partner.
- Demonstrated success in collaborating with different IT functions, contractors, and constituents to deliver data solutions that meet standards and security measures.

Innovation
- Passion for re-imagining new solutions, processes, and end-user experiences by leveraging digital and disruptive technologies and developing advanced data and analytics solutions.
- Advocate of a culture of growth mindset, agility, and continuous improvement.

Complexity
- Demonstrates high multicultural sensitivity to lead teams effectively.
- Ability to coordinate and problem-solve amongst larger teams.
To qualify for the role, you must have the following:

Essential Skillsets
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field.
- 5+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development.
- Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling, and developing and optimizing ETL pipelines.
- Proven track record of designing and implementing complex data solutions.
- Demonstrated understanding of and experience using:
  - Data engineering programming languages (e.g., Python)
  - Distributed data technologies (e.g., PySpark)
  - Cloud platform deployment and tools (e.g., Kubernetes)
  - Relational SQL databases
  - DevOps and continuous integration
  - AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
  - Databricks/ETL
  - IICS/DMS
  - GitHub
  - EventBridge, Tidal
- Understanding of database architecture and administration.
- Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases.
- Possesses high proficiency in programming languages and services (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architectures/pipelines that fit business goals.
- Strong organizational skills, with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions.
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
- Strong problem-solving and troubleshooting skills.
- Ability to work in a fast-paced environment and adapt to changing business priorities.

Desired Skillsets
- Master's degree in Engineering, Computer Science, Data Science, or a related field.
- Experience in a global working environment.

Travel Requirements
- Access to transportation to attend meetings.
- Ability to fly to meetings regionally and globally.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
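The data reconciliation and quality monitoring this posting calls for boils down to comparing what a pipeline loaded against its source. A minimal sketch (the `id` key and the report fields are illustrative, not from any EY system):

```python
def reconcile(source_rows, target_rows, key="id"):
    """Compare source and target datasets after a pipeline load and report
    row-count drift plus keys missing or unexpected in the target."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),     # dropped rows
        "unexpected_in_target": sorted(tgt_keys - src_keys),  # phantom rows
    }

# Toy run: row 2 was dropped by the load, row 9 appeared from nowhere.
report = reconcile([{"id": 1}, {"id": 2}, {"id": 3}],
                   [{"id": 1}, {"id": 3}, {"id": 9}])
print(report)
```

A production version would run as a scheduled check (e.g., a Step Functions or Airflow task) comparing counts and checksums per partition, alerting when either "missing" set is non-empty.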
Posted 1 week ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
- Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles.
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation.
- Contribute to reusable component/asset/accelerator development to support capability development.
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies.
- Participate in customer PoCs to deliver the outcomes.
- Participate in delivery reviews/product reviews and quality assurance, and act as design authority.

Preferred Education
Non-Degree Program

Required Technical and Professional Expertise
- Experience designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and in architecting and implementing data platforms on the Azure Cloud Platform.
- Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
- Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred Technical and Professional Experience
- Experience architecting complex data platforms on the Azure Cloud Platform and on-premises.
- Experience and exposure to implementing Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 1 week ago
5.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Objectives and Purpose
The Senior Data Engineer ingests, builds, and supports large-scale data architectures that serve multiple downstream systems and business users. This individual supports the Data Engineer Leads and partners with Visualization on data quality and troubleshooting needs.

The Senior Data Engineer will:
- Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses.
- Support development, testing, and maintenance of data pipelines and platforms, enabling data quality to be utilized within business dashboards and tools.
- Create, maintain, and support the data platform and infrastructure that enables the analytics front end, including the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes.

Your Key Responsibilities

Data Engineering
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations using AWS-native technologies to support continuing increases in data sources, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve productivity.
- Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
- Mentor and coach staff data engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
- Support standardization, customization, and ad hoc data analysis, and develop the mechanisms to ingest, analyze, validate, normalize, and clean data.
- Write unit/integration/performance test scripts and perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Learn about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics.
- Solve complex data problems to deliver insights that help achieve business objectives.
- Implement statistical data quality procedures on new data sources by applying rigorous, iterative data analytics.

Relationship Building and Collaboration
- Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Support Data Scientists in data sourcing and preparation to visualize data and synthesize insights of commercial value.
- Collaborate with AI/ML engineers to create data products for analytics and data scientist team members to improve productivity.
- Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices, promoting the values of learning and growth.

Skills and Attributes for Success

Technical/Functional Expertise
- Advanced experience with and understanding of data/big data, data integration, data modelling, AWS, and cloud technologies.
- Strong business acumen; knowledge of the pharmaceutical, healthcare, or life sciences sector is preferred but not required.
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata.
- Ability to build and optimize queries (SQL), data sets, big data pipelines, and architectures for structured and unstructured data.
- Experience with or knowledge of Agile software development methodologies.

Leadership
- Strategic mindset: thinking above the minor, tactical details and focusing on the long-term, strategic goals of the organization.
- Advocate of a culture of collaboration and psychological safety.

Decision-making and Autonomy
- Shift from manual decision-making to data-driven, strategic decision-making.
- Proven track record of applying critical thinking to resolve issues and overcome obstacles.

Interaction
- Proven track record of collaboration and developing strong working relationships with key stakeholders by building trust and being a true business partner.
- Demonstrated success in collaborating with different IT functions, contractors, and constituents to deliver data solutions that meet standards and security measures.

Innovation
- Passion for re-imagining new solutions, processes, and end-user experiences by leveraging digital and disruptive technologies and developing advanced data and analytics solutions.
- Advocate of a culture of growth mindset, agility, and continuous improvement.

Complexity
- Demonstrates high multicultural sensitivity to lead teams effectively.
- Ability to coordinate and problem-solve amongst larger teams.
To qualify for the role, you must have the following:

Essential Skillsets
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field.
- 5+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development.
- Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling, and developing and optimizing ETL pipelines.
- Proven track record of designing and implementing complex data solutions.
- Demonstrated understanding of and experience using:
  - Data engineering programming languages (e.g., Python)
  - Distributed data technologies (e.g., PySpark)
  - Cloud platform deployment and tools (e.g., Kubernetes)
  - Relational SQL databases
  - DevOps and continuous integration
  - AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
  - Databricks/ETL
  - IICS/DMS
  - GitHub
  - EventBridge, Tidal
- Understanding of database architecture and administration.
- Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases.
- Possesses high proficiency in programming languages and services (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architectures/pipelines that fit business goals.
- Strong organizational skills, with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions.
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
- Strong problem-solving and troubleshooting skills.
- Ability to work in a fast-paced environment and adapt to changing business priorities.

Desired Skillsets
- Master's degree in Engineering, Computer Science, Data Science, or a related field.
- Experience in a global working environment.

Travel Requirements
- Access to transportation to attend meetings.
- Ability to fly to meetings regionally and globally.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
8.0 - 13.0 years
10 - 15 Lacs
Pune
Work from Office
We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client-handling and stakeholder-management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency.

Job Responsibilities
- Technology leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering and data warehousing project assignments.
- Solution architecture and review: Expertise in conceptualizing solution architecture and low-level design across a range of data engineering technologies (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting platforms (AWS, Azure).
- Manage projects in a fast-paced agile ecosystem, ensuring quality deliverables within stringent timelines.
- Own risk management, maintaining risk documentation and mitigation plans.
- Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments.
- Communication and logical thinking: Demonstrate strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Effectively present and defend team viewpoints, securing buy-in from both technical and client stakeholders.
- Client relationship: Manage client relationships and client expectations independently, and deliver results back to the client. Excellent communication skills are essential.
Education
- BE/B.Tech or Master of Computer Applications

Work Experience
- 8+ years of working experience, with expertise in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend
- Expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle
- Strong data warehousing, data integration, and data modeling fundamentals: star schema, snowflake schema, dimension tables, and fact tables
- Strong command of SQL building blocks, including creating complex SQL queries and procedures
- Experience with AWS or Azure cloud and their service offerings
- Familiarity with techniques such as data modelling, performance tuning, and regression testing
- Willingness to learn and take ownership of tasks
- Excellent written/verbal communication and problem-solving skills
- Understanding of, and working experience with, pharma commercial data sets such as IQVIA, Veeva, Symphony, Liquid Hub, and Cegedim would be an advantage
- Hands-on experience with Scrum methodology (sprint planning, execution, and retrospectives)

Behavioural Competencies
- Teamwork and leadership
- Motivation to learn and grow
- Ownership
- Cultural fit
- Talent management

Technical Competencies
- Problem solving
- Life sciences knowledge
- Communication
- Agile
- PySpark
- Data modelling
- Designing technical architecture
- AWS Data Pipeline
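The dimensional-modeling fundamentals listed above (fact tables, dimension tables, star schema) can be sketched in a few lines. The following example uses Python's built-in sqlite3 module; the table and column names are invented purely for illustration and do not come from any real project.

```python
import sqlite3

# Toy star schema: one fact table referencing one dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE fact_sales (sale_id INTEGER, product_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "Rx"), (2, "OTC")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0)])

# Typical dimensional query: aggregate the fact table by a dimension attribute.
cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""")
result = cur.fetchall()
print(result)  # [('OTC', 50.0), ('Rx', 200.0)]
```

The same fact-to-dimension join pattern applies regardless of the engine (Redshift, Snowflake, Databricks); only the DDL syntax and scale differ.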
Posted 1 week ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Objectives and Purpose
The Senior Data Engineer ingests, builds, and supports large-scale data architectures that serve multiple downstream systems and business users. This individual supports the Data Engineer Leads and partners with Visualization teams on data quality and troubleshooting needs.

The Senior Data Engineer will:
- Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses.
- Support development, testing, and maintenance of data pipelines and platforms, enabling quality data to be utilized within business dashboards and tools.
- Create, maintain, and support the data platform and infrastructure that enables the analytics front end; this includes the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing systems and databases with proper verification and validation processes.

Your Key Responsibilities

Data Engineering
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations using AWS-native technologies to support continuing increases in data source, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation, and deployment of emerging tools and processes to improve productivity.
- Implement processes and systems to provide accurate, available data to key stakeholders, downstream systems, and business processes.
- Mentor and coach staff data engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
- Support standardization, customization, and ad hoc data analysis, and develop mechanisms to ingest, analyze, validate, normalize, and clean data.
- Write unit/integration/performance test scripts, perform the data analysis required to troubleshoot data-related issues, and assist in the resolution of data issues.
- Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve productivity.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Learn about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics.
- Solve complex data problems to deliver insights that help achieve business objectives.
- Implement statistical data-quality procedures on new data sources by applying rigorous, iterative data analytics.

Relationship Building and Collaboration
- Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Support Data Scientists in data sourcing and preparation to visualize data and synthesize insights of commercial value.
- Collaborate with AI/ML engineers to create data products for analytics and data science team members to improve productivity.
- Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices, promoting the values of learning and growth.
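As a toy illustration of the data reconciliation and data-quality monitoring described above (not EY's actual tooling), a minimal sketch in plain Python; the field names and datasets are hypothetical:

```python
# Two basic data-quality checks: source-to-target row-count reconciliation
# and a per-field null report for required fields.
def reconcile(source_rows, target_rows):
    """Return True when the target row count matches the source exactly."""
    return len(source_rows) == len(target_rows)

def null_report(rows, required_fields):
    """Count missing (None or empty-string) values per required field."""
    report = {f: 0 for f in required_fields}
    for row in rows:
        for f in required_fields:
            if row.get(f) in (None, ""):
                report[f] += 1
    return report

source = [{"id": 1, "site": "TVM"}, {"id": 2, "site": ""}]
target = [{"id": 1, "site": "TVM"}, {"id": 2, "site": ""}]
print(reconcile(source, target))            # True
print(null_report(target, ["id", "site"]))  # {'id': 0, 'site': 1}
```

In a production pipeline, checks like these would run after each load step and feed alerting or a reject queue rather than printing to stdout.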
- Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.

Skills and Attributes for Success

Technical/Functional Expertise
- Advanced experience with and understanding of data/big data, data integration, data modelling, AWS, and cloud technologies.
- Strong business acumen; knowledge of the pharmaceutical, healthcare, or life sciences sector is preferred but not required.
- Ability to build processes that support data transformation, workload management, data structures, dependencies, and metadata.
- Ability to build and optimize queries (SQL), data sets, big data pipelines, and architectures for structured and unstructured data.
- Experience with, or knowledge of, Agile software development methodologies.

Leadership
- Strategic mindset: thinks beyond minor, tactical details and focuses on the long-term, strategic goals of the organization.
- Advocate of a culture of collaboration and psychological safety.

Decision-Making and Autonomy
- Shifts from manual decision-making to data-driven, strategic decision-making.
- Proven track record of applying critical thinking to resolve issues and overcome obstacles.

Interaction
- Proven track record of collaboration and of developing strong working relationships with key stakeholders by building trust and being a true business partner.
- Demonstrated success in collaborating with different IT functions, contractors, and constituents to deliver data solutions that meet standards and security measures.

Innovation
- Passion for reimagining solutions, processes, and the end-user experience by leveraging digital and disruptive technologies and developing advanced data and analytics solutions.
- Advocate of a culture of growth mindset, agility, and continuous improvement.

Complexity
- Demonstrates high multicultural sensitivity to lead teams effectively.
- Ability to coordinate and problem-solve among larger teams.
To qualify for the role, you must have the following:

Essential Skillsets
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field
- 5+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development
- Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling, and developing and optimizing ETL pipelines
- Proven track record of designing and implementing complex data solutions
- Demonstrated understanding of and experience using:
  - Data engineering programming languages (e.g., Python)
  - Distributed data technologies (e.g., PySpark)
  - Cloud platform deployment and tools (e.g., Kubernetes)
  - Relational SQL databases
  - DevOps and continuous integration
  - AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
  - Databricks/ETL, IICS/DMS, GitHub, EventBridge, Tidal
- Understanding of database architecture and administration
- Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases
- Possesses high proficiency in programming languages and services (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architecture and pipelines that fit business goals
- Strong organizational skills, with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners
- Strong problem-solving and troubleshooting skills
- Ability to work in a fast-paced environment and adapt to changing business priorities

Desired Skillsets
- Master's degree in Engineering, Computer Science, Data Science, or a related field
- Experience in a global working environment

Travel Requirements
- Access to transportation to attend meetings
- Ability to fly to meetings regionally and globally

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Family: Data Science & Analysis (India)
Travel Required: None
Clearance Required: None

What You Will Do
- Design, develop, and maintain robust, scalable, and efficient data pipelines and ETL/ELT processes.
- Lead and execute data engineering projects from inception to completion, ensuring timely delivery and high quality.
- Build and optimize data architectures for operational and analytical purposes.
- Collaborate with cross-functional teams to gather and define data requirements.
- Implement data quality, data governance, and data security practices.
- Manage and optimize cloud-based data platforms (Azure/AWS).
- Develop and maintain Python/PySpark libraries for data ingestion, processing, and integration with both internal and external data sources.
- Design and optimize scalable data pipelines using Azure Data Factory and Spark (Databricks).
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Develop frameworks for data ingestion, transformation, and validation.
- Mentor junior data engineers and guide best practices in data engineering.
- Evaluate and integrate new technologies and tools to improve data infrastructure.
- Ensure compliance with data privacy regulations (HIPAA, etc.).
- Monitor performance and troubleshoot issues across the data ecosystem.
- Automate deployment of data pipelines using GitHub Actions / Azure DevOps.

What You Will Need
- Bachelor's or Master's degree in Computer Science, Information Systems, Statistics, Math, Engineering, or a related discipline.
- Minimum 5+ years of solid hands-on experience in data engineering and cloud services.
- Extensive working experience with advanced SQL and a deep understanding of SQL.
- Good experience in Azure Data Factory (ADF), Databricks, Python, and PySpark.
- Good experience with modern data storage concepts: data lake, lakehouse.
- Experience with other cloud services (AWS) and data processing technologies is an added advantage.
- Ability to enhance, develop, and resolve defects in ETL processes using cloud services.
- Experience handling large volumes (multiple terabytes) of incoming data from clients and third-party sources in various formats such as text, CSV, EDI X12 files, and Access databases.
- Experience with software development methodologies (Agile, Waterfall) and version control tools.
- Highly motivated, strong problem solver, self-starter, and fast learner with demonstrated analytic and quantitative skills.
- Good communication skills.

What Would Be Nice To Have
- AWS ETL platform: Glue, S3
- One or more programming languages such as Java or .NET
- Experience in the US healthcare domain and insurance claim processing

What We Offer
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.

About Guidehouse
Guidehouse is an Equal Opportunity Employer–Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinances of Los Angeles and San Francisco.

If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains, including @guidehouse.com or guidehouse@myworkday.com.
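A minimal sketch of the kind of CSV ingest-and-validate step described above, using only the Python standard library. The claim-file layout (member_id, claim_amount) is invented for illustration; a real pipeline here would run at scale in ADF/Databricks and route rejects to a quarantine location rather than a list:

```python
import csv
import io

# Simulated incoming client file; in practice this would be read from
# blob storage or S3, not an in-memory string.
raw = io.StringIO("member_id,claim_amount\nA100,250.50\nA101,\nA102,75.00\n")

valid, rejected = [], []
for rec in csv.DictReader(raw):
    try:
        # Type-cast during ingest; a bad or missing amount fails validation.
        rec["claim_amount"] = float(rec["claim_amount"])
        valid.append(rec)
    except ValueError:
        rejected.append(rec)

print(len(valid), len(rejected))  # 2 1
```

The same ingest-validate-route pattern generalizes to the EDI X12 and Access-database sources mentioned above; only the parser changes.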
Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse’s Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant’s dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
- Lead and deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
- Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
- Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
- Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
- Assist and participate in pre-sales, client pursuits, and proposals
- Drive a human-led culture of inclusion and diversity by caring deeply for all team members

Qualifications:
- 6-10 years of relevant hands-on experience in data science, machine learning, and statistical modeling
- Bachelor's or Master's degree in a quantitative field
- Has led a 3-5 member team on multiple end-to-end DS/ML projects
- Excellent communication and client/stakeholder management skills
- Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
- Expertise in classification, regression, time series, decision trees, optimization, etc.
- Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
- Model deployment on cloud or on-prem is an added advantage
- Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
- Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
- AI/Cloud certification from a premier institute is preferred
Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300022
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Summary

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
- Lead and deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
- Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
- Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
- Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
- Assist and participate in pre-sales, client pursuits, and proposals
- Drive a human-led culture of inclusion and diversity by caring deeply for all team members

Qualifications:
- 6-10 years of relevant hands-on experience in data science, machine learning, and statistical modeling
- Bachelor's or Master's degree in a quantitative field
- Has led a 3-5 member team on multiple end-to-end DS/ML projects
- Excellent communication and client/stakeholder management skills
- Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
- Expertise in classification, regression, time series, decision trees, optimization, etc.
- Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
- Model deployment on cloud or on-prem is an added advantage
- Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
- Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
- AI/Cloud certification from a premier institute is preferred
Requisition code: 300022
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a Data Solution Architect (Azure; Databricks). In this role, you will leverage your skills in artificial intelligence and machine learning to design robust data analytics solutions. If you are ready to make an impact, apply today!

Responsibilities
- Design data analytics solutions utilizing the big data technology stack
- Create and present solution architecture documents with technical details
- Collaborate with business stakeholders to identify solution requirements and key scenarios
- Conduct solution architecture reviews and audits while calculating and presenting ROI
- Lead implementation of solutions, from establishing project requirements to go-live
- Engage in pre-sale activities, including customer communications and RFP processing
- Develop proposals, design solutions, and present architectures to customers
- Create and follow a personal education plan covering the technology stack and solution architecture
- Maintain knowledge of industry trends and best practices
- Engage new clients to drive business growth in the big data space

Requirements
- Strong hands-on experience as a big data developer with a solid design background
- Experience delivering data analytics projects and architecture guidelines
- Experience in big data solutions on premises and in the cloud
- Production project experience in at least one big data technology
- Knowledge of batch processing frameworks such as Hadoop, MapReduce, Spark, or Hive
- Familiarity with NoSQL databases such as Cassandra, HBase, or Kudu
- Understanding of Agile development methodology, with an emphasis on Scrum
- Experience in direct customer communications and pre-sales consulting
- Experience working within a consulting environment would be highly valuable
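The batch-processing model behind frameworks like Hadoop and MapReduce, named in the requirements above, can be illustrated with a toy single-process word count in Python; real frameworks distribute these same map, shuffle, and reduce phases across a cluster:

```python
from collections import defaultdict

# Toy input "documents"; a cluster would read these from distributed storage.
docs = ["big data big insights", "data pipelines"]

# Map phase: emit (word, 1) pairs.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle phase: group emitted pairs by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: sum each group's counts.
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts["big"], counts["data"])  # 2 2
```

Spark and Hive express the same pattern through higher-level DataFrame and SQL APIs, but the underlying execution model is this map/shuffle/reduce flow.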
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe action

Job Title: Data Scientist/Machine Learning Engineer

Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions.

Key Responsibilities:
- Deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
- Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
- Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
- Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
- Drive a human-led culture of inclusion and diversity by caring deeply for all team members

Qualifications:
- 2-7 years of relevant hands-on experience in data science, machine learning, and statistical modeling
- Bachelor's or Master's degree in a quantitative field
- Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
- Expertise in classification, regression, time series, decision trees, optimization, etc.
- Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
- Model deployment on cloud or on-prem is an added advantage
- Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
- Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
- AI/Cloud certification from a premier institute is preferred

Requisition code: 300100
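The regression fundamentals listed in the qualifications above can be shown in miniature with a one-variable ordinary-least-squares fit using only the Python standard library; in practice a tool like scikit-learn's LinearRegression would be used on real data:

```python
# Toy dataset lying exactly on y = 2x + 1, so the fitted line is easy to check.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form OLS: slope = cov(x, y) / var(x), intercept from the means.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
print(slope, intercept)  # 2.0 1.0
```

Classification, time series, and tree-based methods from the same list each replace this closed-form fit with their own estimator, but the workflow of fitting parameters to data and validating the fit is the same.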
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Position Summary

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics, and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
- Lead and deliver large-scale, end-to-end DS/ML projects across multiple industries and domains
- Liaise with on-site and client teams to understand business problem statements, use cases, and project requirements
- Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation
- Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions
- Assist and participate in pre-sales, client pursuits, and proposals
- Drive a human-led culture of inclusion and diversity by caring deeply for all team members

Qualifications:
- 6-10 years of relevant hands-on experience in data science, machine learning, and statistical modeling
- Bachelor's or Master's degree in a quantitative field
- Has led a 3-5 member team on multiple end-to-end DS/ML projects
- Excellent communication and client/stakeholder management skills
- Strong hands-on experience with programming languages such as Python, PySpark, and SQL, and frameworks such as NumPy, Pandas, and scikit-learn
- Expertise in classification, regression, time series, decision trees, optimization, etc.
- Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI
- Model deployment on cloud or on-prem is an added advantage
- Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA)
- Follows research papers and can comprehend, innovate on, and present the best DS/ML approaches and solutions
- AI/Cloud certification from a premier institute is preferred
Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 300022
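The classification expertise named in the posting above can be pictured with a minimal sketch. This is illustrative only: the feature vectors, labels, and nearest-centroid model below are invented sample data standing in for the kind of supervised-learning task the role describes.

```python
# Illustrative only: a tiny nearest-centroid classifier sketching the
# fit/predict workflow behind the "Classification" skill in the posting.
# All data here is made up for demonstration.

def fit_centroids(X, y):
    """Average the feature vectors of each class into a centroid."""
    sums, counts = {}, {}
    for features, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the class whose centroid is closest in squared distance."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical training data: two features per observation.
X_train = [[1.0, 1.2], [0.9, 1.0], [3.0, 3.1], [3.2, 2.9]]
y_train = ["low_risk", "low_risk", "high_risk", "high_risk"]

model = fit_centroids(X_train, y_train)
print(predict(model, [3.1, 3.0]))  # prints "high_risk"
```

In practice this step would be a scikit-learn or PySpark MLlib estimator rather than hand-rolled code; the point is only the shape of the fit/predict workflow.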
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
Greater Kolkata Area
On-site
Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe action Job Title: Data Scientist/Machine Learning Engineer Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions. Key Responsibilities: Deliver large-scale DS/ML end-to-end projects across multiple industries and domains Liaise with on-site and client teams to understand various business problem statements, use cases and project requirements Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members Qualifications: 2-7 years of relevant hands-on experience in Data Science, Machine Learning, Statistical Modeling Bachelor’s or Master’s degree in a quantitative field. Must have strong hands-on experience with programming languages like Python, PySpark and SQL, and frameworks such as NumPy, pandas, scikit-learn, etc. Expertise in Classification, Regression, Time series, Decision Trees, Optimization, etc. Hands-on knowledge of Docker containerization, Git, Tableau or PowerBI Model deployment on Cloud or On-prem will be an added advantage Familiar with Databricks, Snowflake, or Hyperscalers (AWS/Azure/GCP/NVIDIA) Should follow research papers and be able to comprehend, innovate on, and present the best approaches/solutions related to DS/ML AI/Cloud certification from a premier institute is preferred. Requisition code: 300100
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Rate filing EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa. EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.
Role & Responsibilities Overview Collaborate with the Rate-filing Team to analyze and estimate reserves for our P&C insurance products by state, including performing triangle-based loss reserve reviews and analyses. Analyze data and perform actuarial calculations to generate state-specific filing indications for DOI submissions Manage rate reviews, DOI filings, DOI complaints, and filings and objections for multiple LOBs Support monthly/quarterly rate updates and processes (rate files, RPC, forecasts) Assist in the development and enhancement of rate-filing tools, models, and processes to improve accuracy and efficiency. Prepare detailed documentation for rate reviews, pricing models, and state filing reports, and prepare presentations Assist in analyzing, identifying and tracking new market trends, including underwriting and rate actions, and propose actions to take Assist in audit functions as needed and ensure compliance with Data Privacy and Protection Guidelines Provide support in the preparation of financial reports, including reserve-related disclosures. Stay updated with best practices in actuarial methodologies and techniques. Mentor and provide guidance to junior team members as needed. Candidate Profile Bachelor’s/Master's degree in engineering, economics, mathematics, actuarial sciences or statistics. Affiliation to IAI or IFOA, with 2-6 CT actuarial exams, will be an added advantage 3-6 years of actuarial experience in the P&C insurance industry Good knowledge of insurance terms Advanced skills in Excel, Databricks, SQL, and other relevant tools for data analysis and modeling. Excellent analytical and problem-solving skills, with the ability to analyze complex data and make data-driven decisions. Strong communication skills, including the ability to effectively communicate actuarial concepts to both technical and non-technical stakeholders. Ability to work independently and collaboratively in a team-oriented environment.
Detail-oriented with strong organizational and time management skills. Ability to adapt to changing priorities and deadlines in a fast-paced environment. What We Offer EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses that our clients engage in. You will also learn effective teamwork and time-management skills - key aspects for personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance/coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.
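As a rough illustration of the "triangle-based loss reserve reviews" this role mentions, the sketch below runs a basic chain-ladder projection on a made-up cumulative paid-loss triangle. The accident years and figures are entirely hypothetical, and a real filing analysis would involve far more judgment (tail factors, credibility weighting, selected vs. indicated factors).

```python
# Hypothetical cumulative paid losses by accident year (rows) and
# development age (columns); later diagonals are not yet observed.
triangle = {
    2020: [100.0, 150.0, 165.0],
    2021: [110.0, 160.0],
    2022: [120.0],
}

def dev_factors(tri):
    """Volume-weighted age-to-age development factors."""
    ages = max(len(row) for row in tri.values())
    factors = []
    for j in range(ages - 1):
        num = sum(row[j + 1] for row in tri.values() if len(row) > j + 1)
        den = sum(row[j] for row in tri.values() if len(row) > j + 1)
        factors.append(num / den)
    return factors

def reserves(tri):
    """Chain-ladder ultimate minus paid-to-date, per accident year."""
    f = dev_factors(tri)
    out = {}
    for ay, row in tri.items():
        ultimate = row[-1]
        for j in range(len(row) - 1, len(f)):
            ultimate *= f[j]
        out[ay] = round(ultimate - row[-1], 2)
    return out

print(dev_factors(triangle))  # ≈ [1.476, 1.100]
print(reserves(triangle))
```

The indicated reserve for the most recent accident year compounds all remaining development factors; mature years carry little or no indicated reserve under this simple method.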
Posted 1 week ago
5.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Reserving Data Analyst EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa. EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.
Role & Responsibilities Overview Support the creation and update of data definitions to allow reserving data to be classed into risk-profiled subsets for actuarial analysis Coding in Databricks to encode claims datasets per their defined class codes for further analysis and reconciliation against multiple internal sources for credibility & data assurance Maintain current expense (Adjustment & Other expense) processes, while improving them toward goal-state levels Develop tracking tables for monitoring catastrophe claims for the client’s book of business Create and enhance templates that estimate and track key actuarial diagnostic metrics like AvE, PYD, settlement rates, etc. Support the preparation of exhibits for regulatory filings Develop and maintain reserve models in tools like ResQ, Arius, etc. Identify opportunities for seamless integration with downstream processes Assist in the development and enhancement of other actuarial tools, models, and processes to improve accuracy and efficiency. Provide support with the preparation of financial reports, reconciliations and dashboards. Stay updated with best practices in actuarial and insurance terminology. Mentor and provide guidance to junior team members as needed. Candidate Profile Bachelor’s/Master's degree in engineering, economics, mathematics, actuarial sciences or statistics. Affiliation to IAI or IFOA, with 2-6 CT actuarial exams, will be an added advantage 5-6 years of actuarial experience in the P&C insurance industry Good knowledge of insurance terms Advanced skills in Excel, Databricks, SQL, and other relevant tools for data analysis and modeling. Familiarity with reserving tools like ResQ or Arius is preferred Excellent analytical and problem-solving skills, with the ability to analyze complex data and make data-driven decisions. Strong communication skills, including the ability to effectively communicate actuarial concepts to both technical and non-technical stakeholders.
Ability to work independently and collaboratively in a team-oriented environment. Detail-oriented with strong organizational and time management skills. Ability to adapt to changing priorities and deadlines in a fast-paced environment.
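The responsibilities above, encoding claims into defined class codes and tracking diagnostics like AvE (actual vs. expected development), might look roughly like the following outside of Databricks. The class-code rules, field names, and figures are all invented for illustration; real definitions would come from the reserving team's data dictionary.

```python
# Illustrative only: hypothetical class-code rules and an actual-vs-expected
# (AvE) diagnostic over a toy claims dataset.

def class_code(claim):
    """Map a claim record to a made-up risk-profile class code."""
    if claim["lob"] == "motor":
        return "MTR-BI" if claim["bodily_injury"] else "MTR-PD"
    return "PROP"

claims = [
    {"lob": "motor", "bodily_injury": True,  "paid": 40.0},
    {"lob": "motor", "bodily_injury": False, "paid": 15.0},
    {"lob": "property", "bodily_injury": False, "paid": 25.0},
]

# Aggregate actual paid movement by class code over the period.
actual = {}
for c in claims:
    code = class_code(c)
    actual[code] = actual.get(code, 0.0) + c["paid"]

# Expected movement from the prior reserve study (hypothetical figures).
expected = {"MTR-BI": 35.0, "MTR-PD": 18.0, "PROP": 25.0}

# AvE: positive values indicate adverse development vs. expectation.
ave = {code: actual.get(code, 0.0) - exp for code, exp in expected.items()}
print(ave)  # {'MTR-BI': 5.0, 'MTR-PD': -3.0, 'PROP': 0.0}
```

In the Databricks setting described, `class_code` would typically be a SQL CASE expression or a UDF applied across the full claims table, with the same reconcile-then-compare structure.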
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe action Job Title: Data Scientist/Machine Learning Engineer Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions. Key Responsibilities: Deliver large-scale DS/ML end-to-end projects across multiple industries and domains Liaise with on-site and client teams to understand various business problem statements, use cases and project requirements Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members Qualifications: 2-7 years of relevant hands-on experience in Data Science, Machine Learning, Statistical Modeling Bachelor’s or Master’s degree in a quantitative field. Must have strong hands-on experience with programming languages like Python, PySpark and SQL, and frameworks such as NumPy, pandas, scikit-learn, etc. Expertise in Classification, Regression, Time series, Decision Trees, Optimization, etc. Hands-on knowledge of Docker containerization, Git, Tableau or PowerBI Model deployment on Cloud or On-prem will be an added advantage Familiar with Databricks, Snowflake, or Hyperscalers (AWS/Azure/GCP/NVIDIA) Should follow research papers and be able to comprehend, innovate on, and present the best approaches/solutions related to DS/ML AI/Cloud certification from a premier institute is preferred. Requisition code: 300100
Posted 1 week ago
15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Global Data Analytics Strategy Lead Experience: 15+ Years Employment Type: Contractual Location: Hyderabad Job Description: This role will lead and deliver the Data Catalogue and Analytics Framework project across the global firm. The primary objective is to establish standardised definitions, shared understanding of data and metrics, and standardised performance measures aligned with business priorities. Additionally, the role will lead the development of a well-defined analytics framework and architecture across the firm using a unified platform. This initiative supports foundational improvements in data management, aligning with the Global Data Strategy & Architecture team’s efforts to standardise and simplify data creation, management, and usage. The role will also manage an external delivery team to ensure successful delivery on time and within budget. Key Responsibilities Project Scoping and Planning: Develop a multi-year project and budget plan for the Data Catalogue and Analytics Framework, aligned with the broader data strategy. Collaborate with senior leaders to create an analytics framework aligned with business goals. Foster engagement and collaboration across cross-functional and leadership stakeholders. Work with the Global Head of Data Strategy & Architecture to select the necessary technology platforms and delivery partners. Project Delivery: Oversee the external delivery partner to achieve the following target outcomes: Standardised metrics across regions, BUs, and teams aligned to business goals Data dictionary and taxonomies for critical data Data Relationship Map linking master and transaction data to standardized metrics Integrated analytics design for HR, Finance, and Client data Architecture for a single unified data platform and shared repository for core business domains Project & Stakeholder Management: Own and manage the delivery plan, addressing technical interdependencies and stakeholder coordination. 
Ensure alignment between project outcomes and broader strategic objectives. Required Qualifications & Experience: Bachelor’s degree in IT, Data Management, Business Administration, or related field (Master’s preferred). 15+ years of total experience, with a minimum of 5 years in data-related roles. Ability to link business goals with measurable metrics. Strong grasp of project and data management, analytics frameworks, and data quality principles. Experience delivering data analytics programs. Familiarity with metadata management and data catalog tools. Proficiency with tools such as Azure Cloud, Power BI, Tableau, and Databricks is advantageous. Experience with fixed-fee project delivery in collaboration with top-tier consulting/delivery partners. How to Apply: Interested candidates with the required experience and skills are encouraged to apply. Please send your resume to admin@ignituslabs.com with the subject line: "Application - Global Data Analytics Strategy Lead". Alternatively, you can directly reach out to our hiring team. We look forward to hearing from you!
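A data-catalogue deliverable like the "standardised metrics" and "data dictionary" this role owns often starts life as a simple structured registry of definitions. The entries and field names below are invented examples of the shape such a registry might take; real catalogues live in dedicated metadata tools rather than code.

```python
# Hypothetical metric-definition registry: one standardised definition per
# metric, so every region/BU computes the same figure the same way.
metric_catalogue = {
    "revenue_per_fte": {
        "definition": "Net revenue divided by average full-time equivalents",
        "owner": "Finance",
        "numerator": "net_revenue",
        "denominator": "avg_fte",
        "unit": "currency per FTE",
    },
    "client_retention_rate": {
        "definition": "Clients active in both periods over prior-period clients",
        "owner": "Client",
        "numerator": "retained_clients",
        "denominator": "prior_period_clients",
        "unit": "ratio",
    },
}

def compute(metric_name, facts):
    """Evaluate a ratio metric from the registry against a fact record."""
    spec = metric_catalogue[metric_name]
    return facts[spec["numerator"]] / facts[spec["denominator"]]

# Example fact record (made-up figures).
print(compute("revenue_per_fte", {"net_revenue": 1200.0, "avg_fte": 40.0}))  # 30.0
```

Keeping the numerator/denominator fields as named references into the data dictionary is what lets the same metric definition be reused across regions and business units.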
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the AI & Engineering (AI&E) practice, our AI & Data offering helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe action Job Title: Data Scientist/Machine Learning Engineer Job Summary: We are seeking a Data Scientist with experience in leveraging data, machine learning, statistics and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and collaborate with a team to implement data-driven solutions. Key Responsibilities: Deliver large-scale DS/ML end-to-end projects across multiple industries and domains Liaise with on-site and client teams to understand various business problem statements, use cases and project requirements Work with a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation Utilize maths/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members Qualifications : 2-7 years of relevant hands-on experience in Data Science, Machine Learning, Statistical Modeling Bachelor’s or Master’s degree in a quantitative field. Must have strong hands-on experience with programming languages like Python, PySpark and SQL, and frameworks such as Numpy, Pandas, Scikit-learn, etc. Expertise in Classification, Regression, Time series, Decision Trees, Optimization, etc. Hands on knowledge of Docker containerization, GIT, Tableau or PowerBI Model deployment on Cloud or On-prem will be an added advantage Familiar with Databricks, Snowflake, or Hyperscalers (AWS/Azure/GCP/NVIDIA) Should follow research papers, comprehend and innovate/present the best approaches/solutions related to DS/ML AI/Cloud certification from a premier institute is preferred. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development

From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 300100
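Regression is among the core modeling skills the qualifications above call for. As a minimal, framework-free sketch (plain Python only; the function name and toy data are illustrative, not from any posting), simple linear regression can be fit in closed form by ordinary least squares:

```python
def fit_simple_ols(xs, ys):
    """Fit y = a + b*x by ordinary least squares (closed-form solution)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    s_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    s_xx = sum((x - mean_x) ** 2 for x in xs)
    slope = s_xy / s_xx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Points lying exactly on y = 1 + 2x recover those coefficients.
a, b = fit_simple_ols([0, 1, 2, 3], [1, 3, 5, 7])  # a == 1.0, b == 2.0
```

In day-to-day work the same fit would typically come from Scikit-learn's `LinearRegression` or a statsmodels OLS, but the closed form above is what those estimators compute for the single-feature case.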
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: 5+ Years

Role Overview: Responsible for designing, building, and maintaining scalable data pipelines and architectures. This role requires expertise in SQL, ETL frameworks, big data technologies, cloud services, and programming languages to ensure efficient data processing, storage, and integration across systems.

Requirements:
• Minimum 5+ years of experience as a Data Engineer or in a similar data-related role.
• Strong proficiency in SQL for querying databases and performing data transformations.
• Experience with data pipeline frameworks (e.g., Apache Airflow, Luigi, or custom-built solutions).
• Proficiency in at least one programming language such as Python, Java, or Scala for data processing tasks.
• Experience with cloud-based data services and data lakes (e.g., Snowflake, Databricks, AWS S3, GCP BigQuery, or Azure Data Lake).
• Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka).
• Experience with ETL tools (e.g., Talend, Apache NiFi, SSIS) and data integration techniques.
• Knowledge of data warehousing concepts and database design principles.
• Good understanding of NoSQL and big data technologies such as MongoDB, Cassandra, Spark, Hadoop, and Hive.
• Experience with data modeling and schema design for OLAP and OLTP systems.
• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

Educational Qualification: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
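The SQL-based transform-and-load work this posting describes can be illustrated in miniature with Python's built-in sqlite3 module (a hedged sketch only; the table and column names here are made up for the example, and a production pipeline would run against a real warehouse via an orchestrator such as Airflow):

```python
import sqlite3

# In-memory database standing in for a source system and a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "paid"), (2, 80.0, "cancelled"), (3, 200.0, "paid")],
)

# Transform + load: keep only paid orders, aggregate into a summary table.
conn.execute(
    """CREATE TABLE order_summary AS
       SELECT status, COUNT(*) AS n, SUM(amount) AS total
       FROM raw_orders
       WHERE status = 'paid'
       GROUP BY status"""
)
n, total = conn.execute("SELECT n, total FROM order_summary").fetchone()
# n == 2, total == 320.0
```

The pattern (stage raw data, filter and aggregate with SQL, load the result into a downstream table) is the same one the ETL tools listed above industrialize at scale.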
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Senior Specialist, Data and Analytics Architect

THE OPPORTUNITY

Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare.

Lead an organization driven by digital technology and data-backed approaches that supports a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be among the leaders who have a passion for using data, analytics, and insights to drive decision-making, which will allow us to tackle some of the world's greatest health threats.

Our technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to our other sites, are essential to supporting our business and strategy.

A focused group of leaders in each tech center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the tech centers.

Role Overview

We are seeking a talented and motivated Technical Architect to join our Data and Analytics Strategy & Architecture team. Reporting to the Lead Architect, this mid-level Technical Architect role is critical in shaping the technical foundation of our cross-product architecture.
The ideal candidate will focus on reference architecture, driving proofs of concept (POCs) and points of view (POVs), staying updated on industry trends, solving technical architecture issues, and enabling a robust data observability framework. The role will also emphasize enterprise data marketplaces and data catalogs to ensure data accessibility, governance, and usability. This position will also focus on creating a customer-centric development environment that is resilient and easily adoptable by various user personas. The outcome of the cross-product integration will be improved efficiency and productivity through accelerated provisioning times and a seamless user experience, eliminating the need to interact with multiple platforms and teams.

What Will You Do In The Role
• Collaborate with product line teams to design and implement cohesive architecture solutions that enable cross-product integration, spanning ingestion, governance, analytics, and visualization.
• Develop, maintain, and advocate for reusable reference architectures that align with organizational goals and industry standards.
• Lead technical POCs and POVs to evaluate new technologies, tools, and methodologies, providing actionable recommendations.
• Diagnose and resolve complex technical architecture issues, ensuring stability, scalability, and performance across platforms.
• Implement and maintain frameworks to monitor data quality, lineage, and reliability across data pipelines.
• Contribute to the design and implementation of an enterprise data marketplace to facilitate self-service data discovery, analytics, and consumption.
• Oversee and extend the use of Collibra or similar tools to enhance metadata management, data governance, and cataloging across the enterprise.
• Monitor emerging industry trends in data and analytics (e.g., AI/ML, data engineering, cloud platforms) and identify opportunities to incorporate them into our ecosystem.
• Work closely with data engineers, data scientists, and other architects to ensure alignment with the enterprise architecture strategy.
• Create and maintain technical documentation, including architecture diagrams, decision records, and POC/POV results.

What Should You Have
• Strong experience with Databricks, Dataiku, Starburst, and related data engineering/analytics platforms.
• Proficiency in AWS cloud platforms and AWS data and analytics technologies.
• Knowledge of modern data architecture patterns such as data lakehouse, data mesh, or data fabric.
• Hands-on experience with Collibra or similar data catalog tools for metadata management and governance.
• Familiarity with data observability tools and frameworks to monitor data quality and reliability.
• Experience contributing to or implementing enterprise data marketplaces, including facilitating self-service data access and analytics.
• Exposure to designing and implementing scalable, distributed architectures.
• Proven experience in diagnosing and resolving technical issues in complex systems.
• Passion for exploring and implementing innovative tools and technologies in data and analytics.
• 3-5 years of total experience in data engineering, analytics, or architecture roles.
• Hands-on experience developing ETL pipelines with dbt, Matillion, and Airflow.
• Experience with data modeling and data visualization tools (e.g., ThoughtSpot, Power BI).
• Strong communication and collaboration skills.
• Ability to work in a fast-paced, cross-functional environment.
• Focus on continuous learning and professional growth.

Preferred Skills
• Certification in Databricks, Dataiku, or a major cloud platform.
• Experience with orchestration tools like Airflow or Prefect.
• Understanding of AI/ML workflows and platforms.
• Exposure to frameworks like Apache Spark or Kubernetes.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities.
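The "data observability" responsibility mentioned above (monitoring data quality and reliability across pipelines) reduces, at its simplest, to automated checks on each batch a pipeline produces. A minimal sketch in plain Python follows; the column names, thresholds, and function names are illustrative assumptions, not any specific vendor's framework:

```python
def null_rate(rows, column):
    """Fraction of records where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_batch(rows, max_null_rate=0.1, min_rows=1):
    """Return a list of human-readable violations for one pipeline batch."""
    violations = []
    if len(rows) < min_rows:
        violations.append(f"row count {len(rows)} below minimum {min_rows}")
    for col in ("id", "amount"):  # hypothetical required columns
        rate = null_rate(rows, col)
        if rate > max_null_rate:
            violations.append(
                f"null rate {rate:.0%} in '{col}' exceeds {max_null_rate:.0%}"
            )
    return violations

batch = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}, {"id": 3, "amount": 5.0}]
issues = check_batch(batch)  # one violation: 'amount' is null in 1 of 3 rows
```

Dedicated observability tools add lineage tracking, anomaly detection, and alerting on top of checks like these, but the contract (assert expectations, surface violations) is the same.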
We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are

We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For

Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are intellectually curious, join us and start making your impact today.

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully

Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs

Preferred Skills:

Job Posting End Date: 07/3/2025

A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.

Requisition ID: R345601
Posted 1 week ago
Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.
The average salary range for Databricks professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum

In the field of Databricks, a typical career path may include:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect

In addition to Databricks expertise, other skills that are often expected or helpful include:
- Apache Spark
- Python/Scala programming
- Data modeling
- SQL
- Data visualization tools
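Apache Spark, the engine underlying Databricks, is built around a map/reduce-style transformation model over partitioned data. The model can be illustrated without a cluster in plain Python (a conceptual sketch only; on Spark these stages would be `flatMap` and `reduceByKey` over a distributed RDD or DataFrame):

```python
from functools import reduce

# A tiny word count in the map/reduce style that Spark generalises to
# distributed datasets (here everything runs over one local list).
lines = ["spark makes big data simple", "big data needs big pipelines"]

pairs = [(word, 1) for line in lines for word in line.split()]  # "map" stage
counts = {}
for word, n in pairs:                                           # "reduce by key"
    counts[word] = counts.get(word, 0) + n

total_words = reduce(lambda acc, n: acc + n, counts.values(), 0)
# counts["big"] == 3, counts["data"] == 2, total_words == 10
```

Understanding this shuffle-free local version makes the distributed variants much easier to reason about in interviews, since Spark's contribution is scheduling exactly these stages across partitions.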
As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!