3.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
About This Role

About Aladdin Financial Engineering (AFE): Join a diverse and collaborative team of over 300 modelers and technologists in Aladdin Financial Engineering (AFE) within BlackRock Solutions, the business responsible for the research and development of Aladdin's financial models. The group is also accountable for analytics production, enhancing the infrastructure platform, and delivering analytics content to portfolio and risk management professionals (both within BlackRock and across the Aladdin client community). The models developed and supported by AFE span a wide array of financial products, covering equities, fixed income, commodities, derivatives, and private markets. AFE provides investment insights that range from the analysis of cash flows on a single bond to the overall financial risk associated with an entire portfolio, balance sheet, or enterprise.

Role Description

We are looking for a person to join the Advanced Data Analytics team within AFE Single Security. Advanced Data Analytics is a team of quantitative data and product specialists focused on delivering Single Security data content, governance, product solutions, and a research platform. The team leverages data, cloud, and emerging technologies to build an innovative data platform, with a focus on business and research use cases in the Single Security space. The team uses statistical and mathematical methodologies to derive insights and generate content that helps develop predictive models, clustering, and classification solutions, and that enables governance. The team works on mortgage, structured, and credit products.

We are looking for a person to help build and expand Data & Analytics content in the Credit space. The person will be responsible for building, enhancing, and maintaining the Credit Content Suite, and will work on:
- Credit derived data content
- Model & data governance
- Credit models & analytics

Experience
- Experience with Scala
- Knowledge of ETL, data curation, and analytical jobs using a distributed computing framework such as Spark (see the brief illustrative sketch after this listing)
- Knowledge of and experience with large enterprise databases like Snowflake and Cassandra, and cloud managed services like Dataproc and Databricks
- Knowledge of financial instruments such as corporate bonds and derivatives
- Knowledge of regression methodologies
- Aptitude for designing and building tools for data governance
- Python knowledge is a plus

Qualifications
- Bachelor's or Master's degree in Computer Science, Math, Economics, or a related field
- 3-6 years of relevant experience

Our Benefits

To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model

BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation.
As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.

This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
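As a rough, hedged illustration of the Spark-against-Snowflake work the Experience section above describes: the sketch below reads a bond table from Snowflake and writes a simple derived-data table. The posting favors Scala, but the DataFrame API has the same shape in PySpark. Every connection value, table, and column name (CORPORATE_BONDS, OAS_BPS, credit.derived_bonds) is an invented placeholder, not a detail from the posting.

```python
# Minimal sketch under stated assumptions: pull reference data from
# Snowflake into Spark and publish a derived table for analytics content.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("credit-content").getOrCreate()

sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",  # placeholder account
    "sfUser": "svc_afe",                          # placeholder credentials
    "sfPassword": "********",
    "sfDatabase": "MARKETS",
    "sfSchema": "CREDIT",
    "sfWarehouse": "ANALYTICS_WH",
}

bonds = (
    spark.read.format("snowflake")   # short name on Databricks; elsewhere
                                     # "net.snowflake.spark.snowflake"
    .options(**sf_options)
    .option("dbtable", "CORPORATE_BONDS")
    .load()
)

# Example derived field: bucket option-adjusted spread into 50 bp bands.
enriched = bonds.withColumn("spread_bucket", F.floor(F.col("OAS_BPS") / 50) * 50)
enriched.write.mode("overwrite").saveAsTable("credit.derived_bonds")
```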
Posted 1 week ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Databricks Dashboard Engineer

Job Summary

We are looking for a versatile Databricks Dashboard Engineer with strong SQL coding skills who can design and build interactive dashboards as well as contribute to data engineering efforts. The engineer works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices; is responsible for repeatable, lean, and maintainable enterprise BI design across organizations; and partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities
- Design, develop, and maintain interactive dashboards and visualizations using Databricks SQL, Delta Lake, and Notebooks.
- Collaborate with business stakeholders to gather dashboard requirements and deliver actionable insights.
- Optimize data models and queries for performance and scalability.
- Integrate Databricks data with BI tools such as Power BI, Tableau, or Looker.
- Automate dashboard refreshes and monitor data quality.
- Maintain comprehensive documentation for dashboards.
- Work closely with data engineers and analysts to ensure data governance and reliability.
- Stay current with Databricks platform capabilities and dashboarding best practices.
- Design, develop, test, and deploy data model and dashboard processes (batch or real-time) using tools such as Databricks and Power BI.
- Create functional and technical documentation, e.g. data model architecture documentation, unit testing plans and results, data integration specifications, and data testing plans.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the conceptual, logical, and physical data models based on those needs.
- Perform data analysis to validate data models and to confirm they meet business needs.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data modelling and dashboarding.
- Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.
- Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include but are not limited to Databricks, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
- Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
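As a rough illustration of the dashboarding workflow above, here is a minimal notebook cell that pre-aggregates a Delta table for a Databricks SQL (or Power BI/Tableau) dashboard to query cheaply. All table and column names (sales_raw, order_ts, analytics.daily_sales) are hypothetical, not from the posting.

```python
# Sketch: prepare a small, dashboard-ready aggregate from a raw Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dashboard-prep").getOrCreate()

daily = (
    spark.table("sales_raw")                       # hypothetical source table
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("orders"),
    )
)

# Overwrite a pre-aggregated Delta table so each dashboard refresh scans
# a few thousand rows instead of the raw fact table.
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_sales")
```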
Required Qualifications
- 8+ years of industry implementation experience with data warehousing tools such as AWS Redshift, Synapse, Databricks, Power BI, Tableau, Qlik, Looker, etc.
- 3+ years of experience in Databricks dashboard development.
- 3-5 years of development experience in decision support / business intelligence environments using tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
- Proficient in SQL, data modeling, and query optimization.
- Experience with Databricks SQL, Delta Lake, and notebook development.
- Familiarity with BI visualization tools like Power BI, Tableau, or Looker.
- Understanding of data warehousing, ETL/ELT pipelines, and cloud data platforms.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong background in data warehousing, OLTP systems, data integration, and the SDLC.
- Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or similar, with experience in CI/CD using one or more code management platforms.
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
- Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP).
Posted 1 week ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Company

Veersa is a healthtech company that leverages emerging technology and data science to solve business problems in the US healthcare industry. Veersa has established a niche in serving small and medium entities in the US healthcare space through its tech frameworks, platforms, and tech accelerators, and is known for providing innovative solutions using technology and data science to its client base; it is the preferred innovation partner to its clients. Veersa's rich technology expertise manifests in the various tech accelerators and frameworks developed in-house to assist in rapid solutions delivery and implementation. Its end-to-end data ingestion, curation, transformation, and augmentation framework has helped several clients quickly derive business insights and monetize data assets. Veersa teams work across all emerging technology areas such as AI/ML, IoT, and Blockchain, using tech stacks such as MEAN, MERN, Python, GoLang, and RoR, backends such as Java Spring Boot and NodeJS, and databases such as PostgreSQL, MS SQL, MySQL, and Oracle, on AWS and Azure cloud using serverless architecture. Veersa has two major business lines: Veersalabs, an in-house R&D and product development platform, and Veersa tech consulting, which delivers technical solutions for clients. Veersa's customer base includes large US healthcare software vendors, pharmacy chains, payers, providers, and hospital chains. Though Veersa's focus geography is North America, it also provides product engineering expertise to a few clients in Australia and Singapore.

Position: SE / Senior Data Engineer (SQL, Python, Airflow, Bash)

About The Role

We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our growing Data Engineering team. In this critical role, you will design, architect, and develop cutting-edge multi-tenant SaaS data solutions hosted on Azure Cloud. Your work will focus on delivering robust, scalable, and high-performance data pipelines and integrations that support our enterprise provider and payer data ecosystem. This role is ideal for someone with deep experience in ETL/ELT processes, data warehousing principles, and real-time and batch data integrations. As a senior member of the team, you will also be expected to mentor and guide junior engineers, help define best practices, and contribute to the overall data strategy. We are specifically looking for someone with strong hands-on experience in SQL and Python, and ideally Airflow and Bash scripting.

Key Responsibilities
- Architect and implement scalable data integration and data pipeline solutions using Azure cloud services.
- Design, develop, and maintain ETL/ELT processes, including data extraction, transformation, loading, and quality checks, using tools like SQL, Python, and Airflow.
- Build and automate data workflows and orchestration pipelines; knowledge of Airflow or equivalent tools is a plus.
- Write and maintain Bash scripts for automating system tasks and managing data jobs.
- Collaborate with business and technical stakeholders to understand data requirements and translate them into technical solutions.
- Develop and manage data flows, data mappings, and data quality & validation rules across multiple tenants and systems.
- Implement best practices for data modeling, metadata management, and data governance.
- Configure, maintain, and monitor integration jobs to ensure high availability and performance.
- Lead code reviews, mentor data engineers, and help shape engineering culture and standards.
- Stay current with emerging technologies and recommend tools or processes to improve the team's effectiveness.

Required Qualifications
- B.Tech or B.E. degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in data engineering, with a strong focus on Azure-based solutions.
- Proficiency in SQL and Python for data processing and pipeline development.
- Experience in developing and orchestrating pipelines using Airflow (preferred) and writing automation scripts using Bash.
- Proven experience in designing and implementing real-time and batch data integrations.
- Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, Databricks, or similar technologies.
- Strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture.
- Familiarity with data quality, metadata management, and data validation frameworks.
- Strong problem-solving skills and the ability to communicate complex technical concepts clearly.

Preferred Qualifications
- Experience with multi-tenant SaaS data solutions.
- Background in healthcare data, especially provider and payer ecosystems.
- Familiarity with DevOps practices, CI/CD pipelines, and version control systems (e.g., Git).
- Experience mentoring and coaching other engineers in technical and architectural decision-making.

(ref:hirist.tech)
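A minimal sketch of the SQL/Python/Airflow/Bash combination the listing asks for, assuming Airflow 2.x. The DAG id, script path, and validation logic are invented for illustration, not taken from the posting.

```python
# Sketch of an Airflow DAG that runs a Bash-driven extract and a Python
# data-quality gate on a nightly schedule.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_rowcount(**context):
    # Placeholder quality gate; a real check would query the warehouse and
    # raise when counts fall outside an agreed tolerance.
    print("row-count validation passed")


with DAG(
    dag_id="tenant_claims_daily",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                    # "schedule_interval" on older 2.x
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="python /opt/jobs/extract_claims.py",  # hypothetical script
    )
    validate = PythonOperator(task_id="validate", python_callable=validate_rowcount)

    extract >> validate
```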
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Azure Data Platform & AI

Description

We are looking for Azure Data Platform & AI experts passionate about developing world-class training content, lab exercises, and learning experiences that help professionals master data engineering and AI on Microsoft Azure. In this role, you will design hands-on training modules, test lab manuals, and proof-of-concept scenarios that demonstrate how to build and deploy scalable AI-driven data solutions using Azure Data Platform services.

Key Responsibilities
- Develop hands-on labs, structured training modules, and real-world scenarios focused on Azure Synapse, Data Lake, Data Factory, and Databricks.
- Create step-by-step lab manuals and documentation for learners to practice real-world data engineering and AI scenarios.
- Design proof-of-concept projects and sample datasets to simulate business challenges.
- Build AI-driven analytics exercises using Azure AI, Azure Machine Learning, and OpenAI services.
- Develop structured learning paths covering data ingestion, transformation, storage, and analytics on Azure.
- Create real-time streaming and event-driven data processing labs using Azure Event Hubs, Kafka, and Stream Analytics.
- Work with instructional designers to ensure content is engaging, learner-friendly, and aligned with industry best practices.
- Optimize training environments for scalability and seamless deployment.
- Stay updated on the latest Azure Data & AI advancements and incorporate them into learning materials.
- Support hackathons, AI boot camps, and data challenges by developing hands-on exercises.

Required Skills & Experience
- 5+ years of experience in Azure Data Platform, data engineering, and AI/ML development.
- Expertise in Azure Synapse Analytics, Data Lake, Data Factory, and Databricks.
- Strong SQL and data modeling skills for enterprise-grade data warehousing.
- Experience building training content, lab manuals, or structured learning materials.
- Proficiency in Python, PySpark, or Scala for data engineering and AI/ML applications.
- Hands-on experience with real-time data streaming (Event Hubs, Kafka, Stream Analytics).
- Strong understanding of Azure AI services, OpenAI, and Azure Machine Learning.
- Experience designing step-by-step guides and practical exercises for learners.
- Knowledge of CI/CD pipelines, infrastructure-as-code (Terraform, Bicep), and Azure DevOps.

Nice To Have Skills
- Experience with LLM fine-tuning and prompt engineering.
- Exposure to Power BI, Fabric, and self-service analytics training.
- Experience facilitating hackathons, data challenges, or certification boot camps.
- Certification in Azure Data & AI (DP-203, AI-102, or similar).

What we offer:
- Opportunity to create training content for the latest Azure Data & AI technologies.
- A fast-paced, innovation-driven environment focused on upskilling professionals.
- Flexible work location and schedule.
- Competitive compensation and performance-based incentives.

(ref:hirist.tech)
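One way such a streaming lab exercise might look, sketched under assumptions: Spark Structured Streaming reading a Kafka-compatible endpoint (Azure Event Hubs exposes one) into a windowed aggregate. Broker, topic, and sink names are placeholders.

```python
# Lab sketch: per-minute click counts from a Kafka/Event Hubs topic.
# The Kafka source requires the spark-sql-kafka package on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-lab").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "clickstream")                 # hypothetical topic
    .load()
    .selectExpr("CAST(value AS STRING) AS body", "timestamp")
)

clicks_per_minute = events.groupBy(F.window("timestamp", "1 minute")).count()

query = (
    clicks_per_minute.writeStream
    .outputMode("complete")
    .format("memory")                  # in-memory sink, handy for a lab
    .queryName("clicks_per_minute")
    .start()
)
# Learners can then inspect results with:
#   spark.sql("SELECT * FROM clicks_per_minute").show()
```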
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description: Data Engineer

Location: Bengaluru
Experience: 3 - 5 years

About The Role

We are seeking a highly skilled Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering technologies, especially SQL, Databricks, and Azure services, along with client-facing experience.

Key Responsibilities

Technical Expertise
- Hands-on experience with SQL, Databricks, PySpark, Python, Azure Cloud, and Power BI.
- Design, develop, and optimize PySpark workloads.
- Write scalable, modular, and reusable code in SQL, Python, and PySpark.
- Communicate clearly with client stakeholders and collaborate with cross-functional teams.
- Gather requirements and translate business requirements into technical specifications.

Stakeholder Management
- Engage with stakeholders to gather and analyze requirements.
- Provide regular updates and reports on project status and progress.
- Ensure alignment of data solutions with business objectives.

Shift Requirements
- Ability to work during US shift hours to coordinate with global teams.

Qualifications
- Proficiency in SQL, Databricks, PySpark, Python, Azure Cloud, and Power BI.
- Strong communication skills, both written and verbal.
- Proven ability to work effectively with global stakeholders.
- Strong problem-solving skills and attention to detail.

(ref:hirist.tech)
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Responsibilities
- Design, implement, and optimize big data pipelines in Databricks.
- Develop scalable ETL workflows to process large datasets.
- Leverage Apache Spark for distributed data processing and real-time analytics.
- Implement data governance, security policies, and compliance standards.
- Optimize data lakehouse architectures for performance and cost-efficiency.
- Collaborate with data scientists, analysts, and engineers to enable advanced AI/ML workflows.
- Monitor and troubleshoot Databricks clusters, jobs, and performance bottlenecks.
- Automate workflows using CI/CD pipelines and infrastructure-as-code practices.
- Ensure data integrity, quality, and reliability in all pipelines.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of hands-on experience with Databricks and Apache Spark.
- Proficiency in SQL, Python, or Scala for data processing and analysis.
- Experience with cloud platforms (AWS, Azure, or GCP) for data engineering.
- Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture.
- Experience with CI/CD tools and DevOps best practices.
- Familiarity with data security, compliance, and governance best practices.
- Strong problem-solving and analytical skills with an ability to work in a fast-paced environment.

Preferred Qualifications
- Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer).
- Hands-on experience with MLflow, Feature Store, or Databricks SQL.
- Exposure to Kubernetes, Docker, and Terraform.
- Experience with streaming data architectures (Kafka, Kinesis, etc.).
- Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker).
- Prior experience working with retail, e-commerce, or ad-tech data platforms.

(ref:hirist.tech)
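A hedged sketch of one responsibility above: an idempotent Delta Lake upsert with a basic quality gate, of the kind a Databricks pipeline might run per batch. The path, table name, and key column are illustrative assumptions, not details from the posting.

```python
# Sketch: deduplicate a raw batch and MERGE it into a Delta target so
# re-runs of the same batch do not create duplicates.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable   # delta-spark; bundled on Databricks

spark = SparkSession.builder.appName("events-upsert").getOrCreate()

updates = (
    spark.read.json("s3://raw-bucket/events/2024-06-01/")  # hypothetical path
    .dropDuplicates(["event_id"])
    .filter(F.col("event_id").isNotNull())                 # basic quality gate
)

target = DeltaTable.forName(spark, "lakehouse.events")     # hypothetical table
(
    target.alias("t")
    .merge(updates.alias("u"), "t.event_id = u.event_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```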
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Engineer
Location: Bangalore
Experience: 3+ years

About the Opportunity

We are urgently looking for experienced Data Engineers to join our team at Hexamobile, Bangalore. Ideal candidates will have a strong background in Python, PySpark, and ETL processes, with Azure cloud experience being a strong plus.

Responsibilities
- Design, develop, and maintain scalable and efficient data pipelines using Python and PySpark.
- Build and optimize ETL (Extract, Transform, Load) processes to ingest, clean, transform, and load data from various sources into data warehouses and data lakes (see the sketch after this listing).
- Work with large and complex datasets, ensuring data quality, integrity, and reliability.
- Collaborate closely with data scientists, analysts, and other stakeholders to understand their data requirements and provide them with clean and well-structured data.
- Monitor and troubleshoot data pipelines, identifying and resolving issues to ensure continuous data flow.
- Implement data quality checks and validation processes to maintain high data accuracy.
- Develop and maintain comprehensive documentation for data pipelines, ETL processes, and data models.
- Optimize data systems and pipelines for performance, scalability, and cost-efficiency.
- Implement data security and governance policies and procedures.
- Stay up-to-date with the latest advancements in data engineering technologies and best practices.
- Work in an agile environment, participating in sprint planning, daily stand-ups, and code reviews.
- Contribute to the design and architecture of our data platform.

Required Skills
- Python: Strong proficiency in Python programming, including experience with data manipulation libraries (e.g., Pandas, NumPy).
- PySpark: Extensive hands-on experience with Apache Spark using PySpark for large-scale data processing and distributed computing.
- ETL Processes: Deep understanding of ETL concepts, methodologies, and best practices. Proven experience in designing, developing, and implementing ETL pipelines.
- SQL: Solid understanding of SQL and experience in querying, manipulating, and transforming data in relational databases.
- Understanding of Databases: Strong understanding of various database systems, including relational databases (e.g., PostgreSQL, MySQL, SQL Server) and potentially NoSQL databases.
- Version Control: Experience with version control systems, particularly Git, and platforms like GitHub or GitLab (i.e., working with branches and pull requests).

Preferred Skills
- Azure Cloud Experience: Hands-on experience with Microsoft Azure cloud services, particularly data-related services such as Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure SQL Database, and Azure Data Lake Storage.
- Experience with data warehousing concepts.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of professional experience as a Data Engineer.
- Proven experience in building and maintaining data pipelines using Python and PySpark.
- Strong analytical and problem-solving skills.
- Good verbal and written communication skills.
- Ability to work effectively both independently and as part of a team.
- Must be available to join immediately.

Bonus Points
- Experience with other big data technologies (Hadoop, Hive, Kafka, Apache Airflow).
- Knowledge of data governance and data quality frameworks.
- Experience with CI/CD pipelines for data engineering workflows.
- Familiarity with data visualization tools (Power BI, Tableau).
- Experience with other cloud platforms (AWS, GCP).

(ref:hirist.tech)
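To make the ingest-clean-validate-load flow above concrete, here is a minimal PySpark sketch. File locations, column names, and the 10% tolerance are invented for illustration.

```python
# Sketch: read a landing-zone CSV, clean and validate it, then write the
# curated result as Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/landing/orders.csv")

clean = (
    raw.dropna(subset=["order_id", "customer_id"])        # required keys
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") >= 0)                         # reject bad amounts
)

# Simple data-quality check: fail fast if cleaning dropped too many rows.
if clean.count() < 0.9 * raw.count():
    raise ValueError("more than 10% of rows failed validation")

clean.write.mode("overwrite").parquet("/mnt/curated/orders/")
```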
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities
- Design and develop robust ETL/ELT pipelines using Python and AWS Glue / Lambda.
- Work with AWS services such as S3, Athena, Redshift, Glue, Step Functions, and CloudWatch.
- Build and maintain data integration processes between internal and external data sources.
- Optimize data pipelines for performance, scalability, and reliability.
- Implement data quality checks and monitoring.
- Collaborate with data analysts, engineers, and product teams to meet data requirements.
- Maintain proper documentation and ensure best practices in data engineering.
- Work with structured and semi-structured data formats (JSON, Parquet, etc.).

Skills
- 4-6 years of experience as a Data Engineer.
- Strong programming skills in Python (Pandas, Boto3, PySpark).
- Proficient in SQL and performance tuning.
- Hands-on experience with AWS services: S3, Glue, Lambda, Athena, Redshift, Step Functions, CloudWatch.
- Experience working with Databricks or EMR is a plus.
- Experience with data lake and data warehouse concepts.
- Familiar with version control systems like Git.
- Knowledge of CI/CD pipelines and workflow tools (Airflow is a plus).

(ref:hirist.tech)
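A small, hedged Boto3 example touching the AWS services named above: starting a Glue job run and firing an Athena validation query. The job name, database, table, and results bucket are hypothetical placeholders.

```python
# Sketch: orchestration glue code of the kind such a role writes daily.
import boto3

glue = boto3.client("glue")
athena = boto3.client("athena")

# Kick off a (hypothetical) Glue ETL job and record its run id.
run = glue.start_job_run(JobName="curate-orders")
print("Glue run id:", run["JobRunId"])

# Fire an Athena query as a cheap post-load validation step.
query = athena.start_query_execution(
    QueryString="SELECT COUNT(*) FROM curated.orders",    # hypothetical table
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("Athena query id:", query["QueryExecutionId"])
```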
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Azure Cloud and Python Developer for ML - Senior 1/2

EY GDS Consulting digital engineering is seeking an experienced Azure Cloud and Python Developer for ML to join our Emerging Technologies team in DET, GDS. This role presents an exciting opportunity to contribute to innovative projects and be a key player in shaping our technological advancements.

The opportunity

We are seeking an experienced Azure Cloud and Python Developer with 6-8 years of hands-on experience in machine learning (ML) development. This role involves developing and deploying ML models on the Azure cloud platform, designing efficient data pipelines, and collaborating with data scientists and stakeholders to deliver technical solutions.

Your Key Responsibilities
- Develop and deploy machine learning models on the Azure cloud platform using Python, ensuring scalability and efficiency.
- Design and implement scalable and efficient data pipelines for model training and inference, optimizing data processing workflows.
- Collaborate closely with data scientists and business stakeholders to understand requirements, translate them into technical solutions, and deliver high-quality ML solutions.
- Implement best practices for ML development, including version control using tools like Git, testing methodologies, and documentation to ensure reproducibility and maintainability.
- Design and optimize ML algorithms and data structures for performance and accuracy, leveraging Azure cloud services and Python libraries such as TensorFlow, PyTorch, or scikit-learn.
- Monitor and evaluate model performance, conduct experiments, and iterate on models to improve predictive accuracy and business outcomes.
- Work on feature engineering, data preprocessing, and feature selection techniques to enhance model performance and interpretability.
- Collaborate with DevOps teams to deploy ML models into production environments, ensuring seamless integration and continuous monitoring.
- Stay updated with the latest advancements in ML, Azure cloud services, and Python programming, and apply them to enhance ML capabilities and efficiency.
- Provide technical guidance and mentorship to junior developers and data scientists, fostering a culture of continuous learning and innovation.

Skills And Attributes
- Bachelor's or master's degree in computer science, data science, or a related field, with a strong foundation in ML algorithms, statistics, and programming concepts.
- Minimum 6-8 years of hands-on experience in developing and deploying ML models on the Azure cloud platform using Python.
- Expertise in designing and implementing scalable data pipelines for ML model training and inference, utilizing Azure Data Factory, Azure Databricks, or similar tools.
- Proficiency in Python, including libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy for ML model development and data manipulation.
- Strong understanding of ML model evaluation metrics, feature engineering techniques, and data preprocessing methods for structured and unstructured data.
- Experience with cloud-native technologies and services, including Azure Machine Learning, Azure Kubernetes Service (AKS), Azure Functions, and Azure Storage.
- Familiarity with DevOps practices, CI/CD pipelines, and containerization tools like Docker for ML model deployment and automation.
- Excellent problem-solving skills, analytical thinking, and attention to detail, with the ability to troubleshoot and debug complex ML algorithms and systems.
- Effective communication skills, both verbal and written, with the ability to explain technical concepts to non-technical stakeholders and collaborate in cross-functional teams.
- Proactive and self-motivated attitude, with a passion for learning new technologies and staying updated with industry trends in ML, cloud computing, and software development.
- Strong organizational skills and the ability to manage multiple projects, prioritize tasks, and deliver results within project timelines and specifications.
- Business acumen and understanding of the impact of ML solutions on business operations and decision-making processes, with a focus on delivering value and driving business outcomes.
- Collaboration and teamwork skills, with the ability to work effectively in a global, diverse, and distributed team environment, fostering a culture of innovation and continuous improvement.

To qualify for the role, you must have
- A bachelor's or master's degree in computer science, data science, or a related field, along with a minimum of 6-8 years of experience in ML development and Azure cloud platform expertise.
- Strong communication skills and consulting experience are highly desirable for this position.

Ideally, you'll also have
- Analytical ability to manage complex ML projects and prioritize tasks efficiently.
- Experience operating independently or with minimal supervision, demonstrating strong problem-solving skills.
- Familiarity with other cloud platforms and technologies such as AWS, Google Cloud Platform (GCP), or Kubernetes is a plus.

What Working At EY Offers

At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
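As a rough sketch of the train-evaluate-persist loop such a role involves (for example, before registering the artifact with Azure ML), here is a minimal scikit-learn example. The feature file, label column, and model choice are assumptions for illustration only.

```python
# Sketch: train a classifier, report a holdout metric, persist the artifact.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_parquet("features.parquet")        # hypothetical feature table
X, y = df.drop(columns=["label"]), df["label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

# Persist the model; a deployment step (e.g., Azure ML registration)
# would pick this artifact up.
joblib.dump(model, "model.joblib")
```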
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Azure Cloud and Python Developer for ML - Senior 1/2

EY GDS Consulting digital engineering is seeking an experienced Azure Cloud and Python Developer for ML to join our Emerging Technologies team in DET, GDS. This role presents an exciting opportunity to contribute to innovative projects and be a key player in shaping our technological advancements.

The opportunity

We are seeking an experienced Azure Cloud and Python Developer with 6-8 years of hands-on experience in machine learning (ML) development. This role involves developing and deploying ML models on the Azure cloud platform, designing efficient data pipelines, and collaborating with data scientists and stakeholders to deliver technical solutions.

Your Key Responsibilities
- Develop and deploy machine learning models on the Azure cloud platform using Python, ensuring scalability and efficiency.
- Design and implement scalable and efficient data pipelines for model training and inference, optimizing data processing workflows.
- Collaborate closely with data scientists and business stakeholders to understand requirements, translate them into technical solutions, and deliver high-quality ML solutions.
- Implement best practices for ML development, including version control using tools like Git, testing methodologies, and documentation to ensure reproducibility and maintainability.
- Design and optimize ML algorithms and data structures for performance and accuracy, leveraging Azure cloud services and Python libraries such as TensorFlow, PyTorch, or scikit-learn.
- Monitor and evaluate model performance, conduct experiments, and iterate on models to improve predictive accuracy and business outcomes.
- Work on feature engineering, data preprocessing, and feature selection techniques to enhance model performance and interpretability.
- Collaborate with DevOps teams to deploy ML models into production environments, ensuring seamless integration and continuous monitoring.
- Stay updated with the latest advancements in ML, Azure cloud services, and Python programming, and apply them to enhance ML capabilities and efficiency.
- Provide technical guidance and mentorship to junior developers and data scientists, fostering a culture of continuous learning and innovation.

Skills And Attributes
- Bachelor's or master's degree in computer science, data science, or a related field, with a strong foundation in ML algorithms, statistics, and programming concepts.
- Minimum 6-8 years of hands-on experience in developing and deploying ML models on the Azure cloud platform using Python.
- Expertise in designing and implementing scalable data pipelines for ML model training and inference, utilizing Azure Data Factory, Azure Databricks, or similar tools.
- Proficiency in Python, including libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy for ML model development and data manipulation.
- Strong understanding of ML model evaluation metrics, feature engineering techniques, and data preprocessing methods for structured and unstructured data.
- Experience with cloud-native technologies and services, including Azure Machine Learning, Azure Kubernetes Service (AKS), Azure Functions, and Azure Storage.
- Familiarity with DevOps practices, CI/CD pipelines, and containerization tools like Docker for ML model deployment and automation.
- Excellent problem-solving skills, analytical thinking, and attention to detail, with the ability to troubleshoot and debug complex ML algorithms and systems.
- Effective communication skills, both verbal and written, with the ability to explain technical concepts to non-technical stakeholders and collaborate in cross-functional teams.
- Proactive and self-motivated attitude, with a passion for learning new technologies and staying updated with industry trends in ML, cloud computing, and software development.
- Strong organizational skills and the ability to manage multiple projects, prioritize tasks, and deliver results within project timelines and specifications.
- Business acumen and understanding of the impact of ML solutions on business operations and decision-making processes, with a focus on delivering value and driving business outcomes.
- Collaboration and teamwork skills, with the ability to work effectively in a global, diverse, and distributed team environment, fostering a culture of innovation and continuous improvement.

To qualify for the role, you must have
- A bachelor's or master's degree in computer science, data science, or a related field, along with a minimum of 6-8 years of experience in ML development and Azure cloud platform expertise.
- Strong communication skills and consulting experience are highly desirable for this position.

Ideally, you'll also have
- Analytical ability to manage complex ML projects and prioritize tasks efficiently.
- Experience operating independently or with minimal supervision, demonstrating strong problem-solving skills.
- Familiarity with other cloud platforms and technologies such as AWS, Google Cloud Platform (GCP), or Kubernetes is a plus.

What Working At EY Offers

At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role

Grade Level (for internal use): 09

The Role: Quality Engineer

The Team: The team works in an Agile environment and adheres to all basic principles of Agile. As a Quality Engineer, you will work with a team of intelligent, ambitious, and hard-working software professionals. The team is independent in driving all decisions and is responsible for the architecture, design, and development of our products with high quality.

The Impact
- Achieve individual objectives and contribute to the achievement of team objectives.
- Work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors.
- Perform ETL testing from various feeds on server (Oracle, SQL, HIVE server, Databricks) using different testing strategies to ensure data quality, data consistency, and timeliness.
- Achieve the above intelligently and economically using QA best practices.

What Is In It For You
- Be part of a successful team which works on delivering top-priority projects that directly contribute to the Company's strategy.
- This is the place to enhance your testing skills while adding value to the business.
- As an experienced member of the team, you will have the opportunity to own and drive a project end to end, and collaborate with developers, business analysts, and product managers who are experts in their domain, which can help you to build multiple skillsets.

Responsibilities

As a Quality Engineer, you are responsible for:
- Defining Quality Metrics: Defining quality standards and metrics for the current project/product; working with all stakeholders to ensure that the quality metrics are reviewed, closed, and agreed upon; creating a list of milestones and checkpoints and setting measurable criteria to check quality on a timely basis.
- Defining Testing Strategies: Defining processes for the test plan and the several phases of the testing cycle; planning and scheduling milestones and tasks like alpha and beta testing; ensuring all development tasks meet quality criteria through test planning, test execution, quality assurance, and issue tracking; working closely to project deadlines; raising the bar and standards of all quality processes with every project; thinking of continuous innovation.
- Managing Risks: Understanding and defining areas to calculate the overall risk to the project; creating strategies to mitigate those risks and taking necessary measures to control them; communicating the various risks and creating awareness among all stakeholders; understanding and reviewing current risks and escalating as needed.
- Process Improvements: Continuously challenging yourself to move toward automation for all daily work and helping others with automation; creating milestones for yearly improvement projects; working with the development team to ensure that quality engineers get apt support, such as automation hooks or debug builds, wherever and whenever possible.

Basic Qualifications

What we are looking for:
- Bachelor's/PG degree in Computer Science, Information Systems, or equivalent.
- 3 to 5 years of intensive experience in database and ETL testing.
- Experience in running queries, data management, managing large data sets, and dealing with databases.
- Strong in creating SQL queries that can parse and validate business rules/calculations.
- Experience in writing complex SQL scripts, stored procedures, and integration packages.
- Experience in tuning and improving DB performance of complex enterprise-class applications.
- Ability to develop comprehensive test strategies, test plans, and test cases for big data implementations (a brief illustrative check appears after this listing).
- Proficient with software development lifecycle (SDLC) methodologies like Agile, QA methodologies, defect management systems, and documentation.
- Good at setting quality standards across new testing technologies in the industry.
- Good at identifying and defining areas to calculate the overall risk to the project, creating strategies to mitigate those risks, and escalating as necessary.
- Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies.

Preferred Qualifications
- Strong in ETL and Big Data testing.
- Proficiency in SQL.

About S&P Global Market Intelligence

At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose

Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People

We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits

We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global

At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)

Job ID: 316170
Posted On: 2025-06-10
Location: Hyderabad, Telangana, India
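To ground the ETL-testing qualifications in this listing, here is a small hedged sketch of a source-versus-target reconciliation check of the kind an ETL tester automates. Connection setup is elided; any Python DB-API driver (pyodbc, cx_Oracle, databricks-sql-connector) would fit, and the table name is a placeholder.

```python
# Sketch: compare row counts between a source feed and its loaded target.
def fetch_scalar(conn, sql: str):
    """Run a query expected to return a single value and return it."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]


def reconcile(src_conn, tgt_conn, table: str) -> None:
    """Fail loudly when source and target row counts diverge."""
    src = fetch_scalar(src_conn, f"SELECT COUNT(*) FROM {table}")
    tgt = fetch_scalar(tgt_conn, f"SELECT COUNT(*) FROM {table}")
    assert src == tgt, f"{table}: source={src} target={tgt} row counts differ"

# Usage (connections come from whichever driver fits the feed):
#   reconcile(oracle_conn, databricks_conn, "trades_daily")
```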
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Azure Cloud and Python Developer for ML - Senior 1/2

EY GDS Consulting digital engineering is seeking an experienced Azure Cloud and Python Developer for ML to join our Emerging Technologies team in DET, GDS. This role presents an exciting opportunity to contribute to innovative projects and be a key player in shaping our technological advancements.

The opportunity

We are seeking an experienced Azure Cloud and Python Developer with 6-8 years of hands-on experience in machine learning (ML) development. This role involves developing and deploying ML models on the Azure cloud platform, designing efficient data pipelines, and collaborating with data scientists and stakeholders to deliver technical solutions.

Your Key Responsibilities
- Develop and deploy machine learning models on the Azure cloud platform using Python, ensuring scalability and efficiency.
- Design and implement scalable and efficient data pipelines for model training and inference, optimizing data processing workflows.
- Collaborate closely with data scientists and business stakeholders to understand requirements, translate them into technical solutions, and deliver high-quality ML solutions.
- Implement best practices for ML development, including version control using tools like Git, testing methodologies, and documentation to ensure reproducibility and maintainability.
- Design and optimize ML algorithms and data structures for performance and accuracy, leveraging Azure cloud services and Python libraries such as TensorFlow, PyTorch, or scikit-learn.
- Monitor and evaluate model performance, conduct experiments, and iterate on models to improve predictive accuracy and business outcomes.
- Work on feature engineering, data preprocessing, and feature selection techniques to enhance model performance and interpretability.
- Collaborate with DevOps teams to deploy ML models into production environments, ensuring seamless integration and continuous monitoring.
- Stay updated with the latest advancements in ML, Azure cloud services, and Python programming, and apply them to enhance ML capabilities and efficiency.
- Provide technical guidance and mentorship to junior developers and data scientists, fostering a culture of continuous learning and innovation.

Skills And Attributes
- Bachelor's or master's degree in computer science, data science, or a related field, with a strong foundation in ML algorithms, statistics, and programming concepts.
- Minimum 6-8 years of hands-on experience in developing and deploying ML models on the Azure cloud platform using Python.
- Expertise in designing and implementing scalable data pipelines for ML model training and inference, utilizing Azure Data Factory, Azure Databricks, or similar tools.
- Proficiency in Python, including libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy for ML model development and data manipulation.
- Strong understanding of ML model evaluation metrics, feature engineering techniques, and data preprocessing methods for structured and unstructured data.
- Experience with cloud-native technologies and services, including Azure Machine Learning, Azure Kubernetes Service (AKS), Azure Functions, and Azure Storage.
- Familiarity with DevOps practices, CI/CD pipelines, and containerization tools like Docker for ML model deployment and automation.
- Excellent problem-solving skills, analytical thinking, and attention to detail, with the ability to troubleshoot and debug complex ML algorithms and systems.
- Effective communication skills, both verbal and written, with the ability to explain technical concepts to non-technical stakeholders and collaborate in cross-functional teams.
- Proactive and self-motivated attitude, with a passion for learning new technologies and staying updated with industry trends in ML, cloud computing, and software development.
- Strong organizational skills and the ability to manage multiple projects, prioritize tasks, and deliver results within project timelines and specifications.
- Business acumen and understanding of the impact of ML solutions on business operations and decision-making processes, with a focus on delivering value and driving business outcomes.
- Collaboration and teamwork skills, with the ability to work effectively in a global, diverse, and distributed team environment, fostering a culture of innovation and continuous improvement.

To qualify for the role, you must have
- A bachelor's or master's degree in computer science, data science, or a related field, along with a minimum of 6-8 years of experience in ML development and Azure cloud platform expertise.
- Strong communication skills and consulting experience are highly desirable for this position.

Ideally, you'll also have
- Analytical ability to manage complex ML projects and prioritize tasks efficiently.
- Experience operating independently or with minimal supervision, demonstrating strong problem-solving skills.
- Familiarity with other cloud platforms and technologies such as AWS, Google Cloud Platform (GCP), or Kubernetes is a plus.

What Working At EY Offers

At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
7.5 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Platform Engineer

Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Engineering graduate, preferably Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using Databricks Unified Data Analytics Platform.
- Design and implement data security and access controls using Databricks Unified Data Analytics Platform.
- Troubleshoot and resolve issues related to data platform components using Databricks Unified Data Analytics Platform.

Professional & Technical Skills:
- Must To Have Skills: Experience with Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with other big data technologies such as Hadoop, Spark, and Kafka.
- Strong understanding of data modeling and database design principles.
- Experience with data security and access controls.
- Experience with data pipeline development and maintenance.
- Experience with troubleshooting and resolving issues related to data platform components.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai and Pune offices.
- Mandatory office attendance (RTO) for 2-3 days per week, working one of two shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering (see the sketch after this listing).
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
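Since this listing pairs Databricks proficiency with ML algorithms and data munging, a compact Spark ML sketch may help illustrate what that combination looks like in practice. The columns and toy data are invented; this is a generic example, not the employer's code.

```python
# Illustrative Spark ML pipeline: normalization followed by logistic regression,
# two of the techniques named in the listing above.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler, StandardScaler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ml-demo").getOrCreate()

# Toy dataset standing in for curated feature data.
df = spark.createDataFrame(
    [(1.0, 2.3, 0.0), (2.1, 0.4, 1.0), (0.5, 1.9, 0.0), (3.3, 0.1, 1.0)],
    ["f1", "f2", "label"],
)

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["f1", "f2"], outputCol="raw_features"),
    StandardScaler(inputCol="raw_features", outputCol="features"),  # normalization
    LogisticRegression(featuresCol="features", labelCol="label"),
])
model = pipeline.fit(df)
model.transform(df).select("label", "prediction").show()
```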
Posted 1 week ago
7.5 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines.
- Design and implement data security and access controls.
- Troubleshoot and resolve issues related to data platform components.

Professional & Technical Skills:
- Must have: experience with the Databricks Unified Data Analytics Platform.
- Must have: strong understanding of data modeling and database design principles.
- Good to have: experience with cloud-based data platforms such as AWS or Azure.
- Good to have: experience with data security and access controls.
- Good to have: experience with data pipeline development and maintenance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai office.
Posted 1 week ago
7.5 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in the development and maintenance of the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.
- Contribute to the overall success of the project.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: experience with data integration tools and ETL processes.
- Strong understanding of application development methodologies.
- Familiarity with cloud computing concepts and services.
- Experience troubleshooting and optimizing application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Azure Cloud and Python Developer for ML - Senior 1/2
EY GDS Consulting digital engineering is seeking an experienced Azure Cloud and Python Developer for ML to join our Emerging Technologies team in DET, GDS. This role presents an exciting opportunity to contribute to innovative projects and be a key player in shaping our technological advancements.

The Opportunity
We are seeking an experienced Azure Cloud and Python Developer with 6-8 years of hands-on experience in machine learning (ML) development. This role involves developing and deploying ML models on the Azure cloud platform, designing efficient data pipelines, and collaborating with data scientists and stakeholders to deliver technical solutions.

Your Key Responsibilities
- Develop and deploy machine learning models on the Azure cloud platform using Python, ensuring scalability and efficiency.
- Design and implement scalable, efficient data pipelines for model training and inference, optimizing data processing workflows.
- Collaborate closely with data scientists and business stakeholders to understand requirements, translate them into technical solutions, and deliver high-quality ML solutions.
- Implement best practices for ML development, including version control with tools like Git, testing methodologies, and documentation, to ensure reproducibility and maintainability.
- Design and optimize ML algorithms and data structures for performance and accuracy, leveraging Azure cloud services and Python libraries such as TensorFlow, PyTorch, or scikit-learn.
- Monitor and evaluate model performance, conduct experiments, and iterate on models to improve predictive accuracy and business outcomes.
- Work on feature engineering, data preprocessing, and feature selection techniques to enhance model performance and interpretability.
- Collaborate with DevOps teams to deploy ML models into production environments, ensuring seamless integration and continuous monitoring.
- Stay updated with the latest advancements in ML, Azure cloud services, and Python, and apply them to enhance ML capabilities and efficiency.
- Provide technical guidance and mentorship to junior developers and data scientists, fostering a culture of continuous learning and innovation.

Skills And Attributes
- Bachelor's or master's degree in computer science, data science, or a related field, with a strong foundation in ML algorithms, statistics, and programming concepts.
- Minimum 6-8 years of hands-on experience developing and deploying ML models on the Azure cloud platform using Python.
- Expertise in designing and implementing scalable data pipelines for ML model training and inference, using Azure Data Factory, Azure Databricks, or similar tools.
- Proficiency in Python, including libraries such as TensorFlow, PyTorch, scikit-learn, pandas, and NumPy, for ML model development and data manipulation.
- Strong understanding of ML model evaluation metrics, feature engineering techniques, and data preprocessing methods for structured and unstructured data.
- Experience with cloud-native technologies and services, including Azure Machine Learning, Azure Kubernetes Service (AKS), Azure Functions, and Azure Storage.
- Familiarity with DevOps practices, CI/CD pipelines, and containerization tools like Docker for ML model deployment and automation.
- Excellent problem-solving skills, analytical thinking, and attention to detail, with the ability to troubleshoot and debug complex ML algorithms and systems.
- Effective communication skills, both verbal and written, with the ability to explain technical concepts to non-technical stakeholders and collaborate in cross-functional teams.
- A proactive, self-motivated attitude, with a passion for learning new technologies and staying current with industry trends in ML, cloud computing, and software development.
- Strong organizational skills and the ability to manage multiple projects, prioritize tasks, and deliver results within project timelines and specifications.
- Business acumen and an understanding of the impact of ML solutions on business operations and decision-making, with a focus on delivering value and driving business outcomes.
- Collaboration and teamwork skills, with the ability to work effectively in a global, diverse, and distributed team environment, fostering a culture of innovation and continuous improvement.

To qualify for the role, you must have
- A bachelor's or master's degree in computer science, data science, or a related field, along with a minimum of 6-8 years of experience in ML development and Azure cloud platform expertise.
- Strong communication skills; consulting experience is highly desirable for this position.

Ideally, you'll also have
- The analytical ability to manage complex ML projects and prioritize tasks efficiently.
- Experience operating independently or with minimal supervision, demonstrating strong problem-solving skills.
- Familiarity with other cloud platforms and technologies such as AWS, Google Cloud Platform (GCP), or Kubernetes.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
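To make the train-and-track workflow this role describes concrete, here is a small hedged sketch using scikit-learn with MLflow, which Azure Machine Learning workspaces track natively. The dataset, run name, and metric choice are placeholder assumptions, not EY specifics.

```python
# Sketch of a train-and-track loop: fit a scikit-learn model, then log the
# metric and model with MLflow so an Azure ML workspace can register/deploy it.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", acc)          # evaluation metric for the run
    mlflow.sklearn.log_model(model, "model")    # artifact a workspace can deploy
```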
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 09
The Role: Quality Engineer
The Team: The team works in an Agile environment and adheres to all basic principles of Agile. As a Quality Engineer, you will work with a team of intelligent, ambitious, and hard-working software professionals. The team is independent in driving all decisions and is responsible for the architecture, design and development of our products with high quality.

The Impact
- Achieve individual objectives and contribute to the achievement of team objectives.
- Work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors.
- Perform ETL testing of various feeds on the server (Oracle, SQL, HIVE server, Databricks), using different testing strategies to ensure data quality, data consistency, and timeliness.
- Achieve the above intelligently and economically using QA best practices.

What's In It For You
Be part of a successful team which works on delivering top-priority projects that contribute directly to the Company's strategy. This is the place to enhance your testing skills while adding value to the business. As an experienced member of the team, you will have the opportunity to own and drive a project end to end, and to collaborate with developers, business analysts and product managers who are experts in their domains, which can help you build multiple skillsets.

Responsibilities
As a Quality Engineer, you are responsible for:
- Defining Quality Metrics: Defining quality standards and metrics for the current project/product. Working with all stakeholders to ensure that the quality metrics are reviewed, closed, and agreed upon. Creating a list of milestones and checkpoints and setting measurable criteria to check quality on a timely basis.
- Defining Testing Strategies: Defining processes for the test plan and the phases of the testing cycle. Planning and scheduling milestones and tasks like alpha and beta testing. Ensuring all development tasks meet quality criteria through test planning, test execution, quality assurance and issue tracking. Working closely to project deadlines. Raising the bar and standards of all quality processes with every project, and thinking of continuous innovation.
- Managing Risks: Understanding and defining areas to calculate the overall risk to the project. Creating strategies to mitigate those risks and taking the necessary measures to control them. Communicating or creating awareness among all stakeholders of the various risks. Understanding and reviewing the current risks, and escalating as needed.
- Process Improvements: Continuously challenging yourself to move daily work toward automation, and helping others do the same. Creating and setting milestones for yearly improvement projects. Working with the development team to ensure that quality engineers get apt support, such as automation hooks or debug builds, wherever and whenever possible.

Basic Qualifications
What we are looking for:
- Bachelor's/PG degree in Computer Science, Information Systems or equivalent.
- 3 to 5 years of intensive experience in database and ETL testing.
- Experience running queries, managing data and large data sets, and dealing with databases.
- Strong at creating SQL queries that can parse and validate business rules/calculations (see the sketch after this listing).
- Experience writing complex SQL scripts, stored procedures, and integration packages.
- Experience tuning and improving DB performance of complex enterprise-class applications.
- Ability to develop a comprehensive test strategy, test plan and test cases for big data implementations.
- Proficiency with software development lifecycle (SDLC) methodologies like Agile, QA methodologies, defect management systems, and documentation.
- Good at setting quality standards across new testing technologies in the industry.
- Good at identifying and defining areas to calculate the overall risk to the project, creating strategies to mitigate those risks, and escalating as necessary.
- Excellent analytical and communication skills, with strong verbal and writing proficiencies.

Preferred Qualifications
- Strong in ETL and Big Data testing
- Proficiency in SQL

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology — the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country, visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316170
Posted On: 2025-06-10
Location: Hyderabad, Telangana, India
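As a concrete illustration of the SQL-based ETL validation this role centres on, here is a small self-contained sketch. sqlite3 stands in for the Oracle/SQL Server/Databricks feeds named in the listing, and the table names and business rule are invented.

```python
# Self-contained sketch of an ETL reconciliation test: compare row counts and
# a business-rule aggregate between "source" and "target" tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def scalar(sql: str) -> float:
    """Run a query expected to return a single value."""
    return conn.execute(sql).fetchone()[0]

# Completeness check: row counts must match after the load.
assert scalar("SELECT COUNT(*) FROM src_trades") == scalar("SELECT COUNT(*) FROM tgt_trades")

# Business-rule check: total notional must be preserved by the ETL.
assert abs(scalar("SELECT SUM(notional) FROM src_trades")
           - scalar("SELECT SUM(notional) FROM tgt_trades")) < 1e-9

print("ETL reconciliation checks passed")
```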
Posted 1 week ago
7.5 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and effective applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, designing application architecture, coding and testing applications, and ensuring their successful deployment and maintenance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications based on business process and application requirements.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with cross-functional teams to ensure the successful implementation of applications.
- Code and test applications to ensure their functionality and performance.
- Ensure the efficient deployment and maintenance of applications.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 week ago
7.5 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in team discussions, providing valuable insights.
- Contribute solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Work on data platform components.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: experience with data integration tools.
- Strong understanding of data platform components and architecture.
- Experience designing and implementing data models.
- Knowledge of data integration best practices.
- Familiarity with data governance and security.
- Hands-on experience with data platform implementation.
- Ability to troubleshoot and resolve data platform issues.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 1 week ago
2.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About Gartner IT
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About The Role
A data warehousing engineer with technical expertise, capable of collaborating with the team to create a data platform strategy and implement the solution.

What You'll Do
- Participate in the design and implementation of the data warehousing solution.
- Participate in the end-to-end delivery of solutions, from gathering requirements through implementation, testing, and continuous improvement post roll-out, using Agile Scrum methodologies.

What You'll Need
- 2-4 years of experience in software programming and/or data warehousing, in an Agile Scrum environment.

Must Have
- Strong experience in SQL, ADF and Synapse/Databricks.
- ETL process design, including techniques for addressing slowly changing dimensions (see the sketch after this listing), differential fact-journaling (i.e., storage optimization for fact data), semi-additive measures and related concerns, and rolldown distributions.
- SQL query optimization.

Who You Are
- Bachelor's degree in computer science or information systems, or equivalent experience in the field of software development.
- Effective time management skills and the ability to meet deadlines, delivering project work on time, within budget, and with high quality.
- Excellent communication skills when interacting with technical and business audiences.
- Excellent organization, multitasking, and prioritization skills.
- A willingness and aptitude to embrace new technologies and ideas, and to master concepts rapidly.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring.

Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 99949

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
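The Gartner listing names slowly changing dimensions as a must-have ETL technique, so here is a hedged sketch of a Type 2 SCD load using Delta Lake SQL from PySpark. The dimension and staging tables, columns, and two-step merge-then-insert structure are illustrative assumptions, and the tables are assumed to already exist.

```python
# Hedged sketch of a Type 2 slowly changing dimension load with Delta Lake SQL;
# table and column names are invented for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scd2-demo").getOrCreate()

# Step 1: expire current dimension rows whose tracked attribute changed.
spark.sql("""
    MERGE INTO dim_customer AS d
    USING stg_customer AS s
      ON d.customer_id = s.customer_id AND d.is_current = true
    WHEN MATCHED AND d.segment <> s.segment THEN UPDATE SET
      d.is_current = false,
      d.valid_to   = current_date()
""")

# Step 2: insert a fresh "current" row for every staged record that has no
# current row (newly arrived keys plus the rows expired in step 1).
spark.sql("""
    INSERT INTO dim_customer
    SELECT s.customer_id, s.segment, current_date() AS valid_from,
           NULL AS valid_to, true AS is_current
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = true
    WHERE d.customer_id IS NULL
""")
```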
Posted 1 week ago
6.0 - 11.0 years
8 - 18 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Role & responsibilities: Data Engineer with expertise in AWS, Databricks, and PySpark.
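The listing is terse, so a small hedged sketch of the AWS-plus-Databricks-plus-PySpark combination it names may help. The bucket path and table name are hypothetical, and S3 credentials are assumed to be configured on the cluster.

```python
# Illustrative PySpark job for the stack named above: read Parquet from S3,
# aggregate, and persist the result as a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-aggregation").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical bucket
daily = (events
         .groupBy(F.to_date("event_ts").alias("event_date"))
         .agg(F.count("*").alias("event_count")))
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_events")
```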
Posted 1 week ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First-class degree in Engineering/Technology (4-year graduate course)
- 9 to 11 years' experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation and manipulation
- Experience modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience with cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills
- An inclination to mentor; an ability to lead and deliver medium-sized components independently

Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience with big data platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Expertise in data warehousing concepts and in relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficiency in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers, including CI/CD platforms, version control, and automated quality control management
- Data Governance: A strong grasp of principles and practice, including data quality, security, privacy and compliance

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of the underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls (illustrated after this listing)
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
- Others: Experience using a job scheduler, e.g., Autosys; exposure to business intelligence tools, e.g., Tableau, Power BI

Certification in any one or more of the above topics would be an advantage.
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
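To illustrate the validation-and-cleansing work listed under "Data Quality & Controls" above, here is a hedged PySpark sketch. The input path, column names, and rejection threshold are assumptions, not details from the posting.

```python
# Sketch of a data-quality gate: cleanse records against basic integrity
# rules, then fail the pipeline if too many rows were rejected.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

trades = spark.read.parquet("/data/trades/")  # illustrative input path

# Cleansing: drop records that fail basic integrity rules.
clean = trades.filter(F.col("trade_id").isNotNull() & (F.col("notional") > 0))

# Control: abort if the rejection rate exceeds an (assumed) 1% threshold.
total, kept = trades.count(), clean.count()
if total and (total - kept) / total > 0.01:
    raise ValueError(f"Data quality gate failed: {total - kept} of {total} rows rejected")

clean.write.mode("overwrite").parquet("/data/trades_clean/")
```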
Posted 1 week ago
Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.
The average salary range for Databricks professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
In the field of Databricks, a typical career path may progress through:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
In addition to Databricks expertise, other skills that are often expected or helpful include (see the short example after this list):
- Apache Spark
- Python/Scala programming
- Data modeling
- SQL
- Data visualization tools
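Since Spark, Python, and SQL dominate that list, here is a short hedged example of the three together, typical of hands-on exercises in Databricks interviews. The data and names are invented.

```python
# Tiny example combining the skills listed above: a PySpark DataFrame
# registered as a view and queried with Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("skills-demo").getOrCreate()

sales = spark.createDataFrame(
    [("north", 120), ("south", 80), ("north", 200)],
    ["region", "amount"],
)
sales.createOrReplaceTempView("sales")

# The aggregation expressed in Spark SQL.
spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC
""").show()
```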
As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!
Accenture
16951 Jobs | Dublin
Wipro
9154 Jobs | Bengaluru
EY
7414 Jobs | London
Amazon
5846 Jobs | Seattle, WA
Uplers
5736 Jobs | Ahmedabad
IBM
5617 Jobs | Armonk
Oracle
5448 Jobs | Redwood City
Accenture in India
5221 Jobs | Dublin 2
Capgemini
3420 Jobs | Paris, France
Tata Consultancy Services
3151 Jobs | Thane