Home
Jobs

7524 Spark Jobs - Page 10

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

9 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Job Description: As a Data Engineer for our Large Language Model project, you will play a crucial role in designing, implementing, and maintaining the data infrastructure. Your expertise will ensure the efficient flow of data, enable seamless integration with various components, and optimize data processing pipelines. 5+ years of relevant experience in data engineering roles is required.

Key Responsibilities:
Data Pipeline Development - Design, develop, and maintain scalable and efficient data pipelines to support the training and deployment of large language models. Implement ETL processes to extract, transform, and load diverse datasets into formats suitable for model training.
Data Integration - Collaborate with cross-functional teams, including data scientists and software engineers, to integrate data sources and ensure the availability of relevant, high-quality data. Implement solutions for real-time data processing and integration, fostering agility in model development.
Data Quality Assurance - Establish and maintain robust data quality checks and validation processes to ensure the accuracy and consistency of datasets. Troubleshoot data quality issues, identify root causes, and implement corrective measures.
Infrastructure Management - Work closely with DevOps and IT teams to manage and optimize the data storage infrastructure, ensuring scalability and performance. Implement best practices for data security, access control, and compliance with data governance policies.
Performance Optimization - Identify bottlenecks and inefficiencies in data processing pipelines and implement optimizations to enhance overall system performance. Continuously monitor and evaluate system performance metrics, making proactive adjustments as needed.

Skills & Tools:
Programming Languages - Proficiency in languages such as Python for building robust data processing applications.
Big Data Technologies - Experience with distributed computing frameworks such as Apache Spark, Databricks, and dbt for large-scale data processing.
Database Systems - In-depth knowledge of both relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., vector databases, MongoDB, Cassandra).
Data Warehousing - Familiarity with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
ETL Tools - Hands-on experience with ETL tools such as Apache NiFi, Talend, or Apache Airflow. Knowledge of NLP is an added advantage.
Cloud Services - Experience with cloud platforms such as AWS, Azure, or Google Cloud for deploying and managing data infrastructure.
Problem Solving - Analytical mindset with a proactive approach to identifying and solving complex data engineering challenges.
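The ETL responsibilities this listing describes follow the standard extract-transform-load pattern. Below is a toy, stdlib-only sketch of that pattern; in production it would typically run on Spark with Airflow orchestration, and the table, column names, and sample rows here are invented purely for illustration.

```python
# Toy extract-transform-load (ETL) sketch using only the standard library.
# A real pipeline would read from object storage and write to a warehouse;
# here an in-memory CSV and SQLite database keep the example self-contained.
import csv
import io
import sqlite3

raw_csv = "user_id,amount\n1, 10.5 \n2,not_a_number\n3,7.25\n"

def extract(text):
    # Extract: parse the raw CSV into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: basic data-quality check that drops rows whose
    # amount fails to parse (a real pipeline would quarantine them).
    clean = []
    for row in rows:
        try:
            clean.append((int(row["user_id"]), float(row["amount"])))
        except ValueError:
            continue
    return clean

def load(rows):
    # Load: write the cleaned rows into a target table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(raw_csv)))
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75 -- the bad row was dropped by the quality check
```

The same three-stage shape carries over directly to PySpark: `extract` becomes `spark.read`, `transform` becomes DataFrame operations, and `load` becomes a `write` to the warehouse.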

Posted 1 day ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigour and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Experience in big data technologies such as Hadoop, Apache Spark, and Hive
Practical experience in Core Java (1.8 preferred), Python, or Scala
Experience with AWS cloud services, including S3, Redshift, and EMR
Strong expertise in RDBMS and SQL
Good experience in Linux and shell scripting
Experience building data pipelines with Apache Airflow

Preferred technical and professional experience:
You thrive on teamwork and have excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
Ability to communicate results to technical and non-technical audiences

Posted 1 day ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

The Developer leads cloud application development and deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments with a focus on uptime, access control, and network security, using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
Primary skills: Core Java, Spring Boot, J2EE, Microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark; Python is good to have
Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of test-driven development
Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
Experience in concurrent design and multi-threading

Preferred technical and professional experience: None

Posted 1 day ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Source: Naukri

Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation. Contribute to reusable components, assets, and accelerators to support capability development. Participate in customer presentations as a platform architect and subject matter expert on big data, Azure cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery and product reviews and quality assurance, and act as a design authority.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Experience designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems
Experience in data engineering and in architecting and implementing data platforms on the Azure cloud platform
Azure cloud experience is mandatory: ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
Experience in the big data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience:
Experience architecting complex data platforms on Azure and on-premises
Experience and exposure to implementations of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
Exposure to data cataloguing and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, and Snowflake data glossary

Posted 1 day ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Kochi

Work from Office

Source: Naukri

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigour and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Experience in big data technologies such as Hadoop, Apache Spark, and Hive
Practical experience in Core Java (1.8 preferred), Python, or Scala
Experience with AWS cloud services, including S3, Redshift, and EMR
Strong expertise in RDBMS and SQL
Good experience in Linux and shell scripting
Experience building data pipelines with Apache Airflow

Preferred technical and professional experience:
You thrive on teamwork and have excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
Ability to communicate results to technical and non-technical audiences

Posted 1 day ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office

Source: Naukri

As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: developing, maintaining, evaluating, and testing big data solutions, and carrying out data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that tackle clients' needs.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Big data development with Hadoop, Hive, Spark, and PySpark; strong SQL
Ability to incorporate a variety of statistical and machine learning techniques
Basic understanding of cloud platforms (AWS, Azure, etc.)
Ability to use programming languages such as Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer
Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
Basic understanding of or experience with predictive/prescriptive modelling
You thrive on teamwork and have excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions

Posted 1 day ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Navi Mumbai

Work from Office

Source: Naukri

As a Data Engineer at IBM, you'll play a vital role in application development and design, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: leading the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements; striving for continuous improvement by testing the built solution and working within an agile framework; and discovering and implementing the latest technology trends to build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing
Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools
Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts
Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation
Data processing frameworks: knowledge of libraries such as Pandas and NumPy
SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation
Cloud platforms: experience with AWS, Azure, or GCP, including cloud storage systems

Preferred technical and professional experience:
Define, drive, and implement an architecture strategy and standards for end-to-end monitoring
Partner with other technology teams, including application development, enterprise architecture, testing services, and network engineering
Good to have: detection and prevention tools for company products, platform, and customer-facing systems
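The SQL-proficiency requirement in this listing refers to the kind of aggregation that turns raw event data into analysis-ready tables. Below is a minimal sketch using SQLite so it stays self-contained; Spark SQL or a warehouse engine would accept a near-identical statement, and the table, column names, and sample rows are invented for illustration.

```python
# Aggregate raw events into per-user features, a common step when
# preparing datasets for analysis or model training.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, event_type TEXT, ts INTEGER);
    INSERT INTO events VALUES
        (1, 'click', 100), (1, 'click', 110), (1, 'purchase', 120),
        (2, 'click', 105), (2, 'purchase', 130), (2, 'purchase', 140);
""")

# SQLite treats the comparison (event_type = 'click') as 0/1, so SUM
# over it counts matching rows per group.
rows = conn.execute("""
    SELECT user_id,
           SUM(event_type = 'click')    AS clicks,
           SUM(event_type = 'purchase') AS purchases
    FROM events
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 2, 1), (2, 1, 2)]
```

In standard SQL (or Spark SQL), the boolean-sum trick would be written with the more portable `SUM(CASE WHEN event_type = 'click' THEN 1 ELSE 0 END)`.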

Posted 1 day ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Chennai

Work from Office

Source: Naukri

The Developer leads cloud application development and deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments with a focus on uptime, access control, and network security, using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of test-driven development
Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands

Preferred technical and professional experience:
Experience in concurrent design and multi-threading
Primary skills: Core Java, Spring Boot, J2EE, Microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark; Python is good to have

Posted 1 day ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Mumbai

Work from Office

Source: Naukri

As a Data Engineer at IBM, you'll play a vital role in application development and design, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: leading the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements; striving for continuous improvement by testing the built solution and working within an agile framework; and discovering and implementing the latest technology trends to build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing
Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools
Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts
Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation
Data processing frameworks: knowledge of libraries such as Pandas and NumPy
SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation
Cloud platforms: experience with AWS, Azure, or GCP, including cloud storage systems

Preferred technical and professional experience:
Define, drive, and implement an architecture strategy and standards for end-to-end monitoring
Partner with other technology teams, including application development, enterprise architecture, testing services, and network engineering
Good to have: detection and prevention tools for company products, platform, and customer-facing systems

Posted 1 day ago

Apply

1.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Source: LinkedIn

Locations: Gurgaon | Bengaluru

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
BCG X is BCG's premier data science and advanced analytics capability, with 1000+ data scientists, engineers, and analysts globally. Within BCG X, our X.Delivery teams are hybrid client-facing/consulting-support teams that leverage available tools and code bases to deliver powerful, specialized insights that help our clients tackle their most pressing business problems. We partner with BCG case teams and practices across the full analytics value chain: framing new business challenges and problems, designing and executing innovative analytical solutions, and interpreting findings for our clients. Geo Analytics is a sub-team within X.Delivery with expertise in geospatial analytics and tools. Our 35+ member team is a global resource, working with clients in every BCG region and every industry area. It is a core member of a rapidly growing analytics enterprise at BCG, a constellation of teams focused on driving practical results for BCG clients by applying leading-edge technologies.

As an Analyst in Geo Analytics, you will solve a variety of location-based analytical problems for our clients. You will apply your knowledge of geospatial data and analytics using tools such as Geographic Information Systems (GIS), Python, and visualization/analytical platforms (e.g., Tableau, Alteryx). You will leverage techniques such as geospatial clustering, network analysis, location-based optimization, and predictive analytics/modelling.

What You'll Bring
An undergraduate degree in a field linked to business analytics, statistics or geo-statistics, economics, operations research, applied mathematics, computer science, engineering, or a related field is required; an advanced degree is preferred
1-3 years of relevant work experience in geo analytics, with an advanced degree preferred
A relevant degree in a geospatial field such as geography, remote sensing, urban planning, statistics/geo-statistics, applied mathematics, or computer science
Previous experience in a global organization or professional services firm is preferred
Data management skills (e.g., data modeling, data integrity QA/QC, database administration)
GIS software such as ESRI ArcGIS, QGIS, CART
Data science programming/coding in Python and/or R
Descriptive statistics and statistical methods (e.g., correlation, regression, clustering) with applications in machine learning
Exposure to BI/data visualization platforms such as Tableau, Power BI, etc.
Handling, processing, and deriving insights from remotely sensed data (preferred)

#BCGXjob

Who You'll Work With
You will work with various teams on client projects.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
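The geospatial techniques this listing mentions (clustering, network analysis, location-based optimization) generally start from a distance metric. Below is a stdlib-only sketch of the standard haversine great-circle distance; the coordinates are rough, illustrative values for Gurgaon and Bengaluru, not data from the listing.

```python
# Haversine great-circle distance between two (lat, lon) points,
# a common first building block in geospatial clustering and routing.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Approximate Gurgaon -> Bengaluru distance (illustrative coordinates).
d = haversine_km(28.46, 77.03, 12.97, 77.59)
print(round(d), "km")
```

In practice an analyst would let a GIS library (e.g., GeoPandas or ArcGIS) compute distances on a proper projection, but the haversine formula is the idea underneath.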

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Website: http://www.intraedge.com

Cloud Data Engineer (GCP)
Experience: 4+ years
Notice period: max 15 days
Location: Gurugram (Hybrid)

We are looking for a highly skilled engineer with solid experience building big data, GCP-based marketing ODL applications. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Technical Skills
1. Core data engineering skills: proficiency with GCP's big data tools, including BigQuery (data warehousing and SQL analytics) and Dataproc (running Spark and Hadoop clusters); expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.
2. Programming and scripting: strong coding skills in SQL and Java; familiarity with APIs and SDKs for GCP services to build custom data solutions.
3. Cloud infrastructure: understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions; familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have).
4. DevOps and CI/CD: experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools; monitoring and logging with Cloud Monitoring and Cloud Logging for production workflows.

Soft Skills
1. Innovation and problem-solving: ability to think creatively and design innovative solutions for complex data challenges; experience prototyping and experimenting with cutting-edge GCP tools or third-party integrations; a strong analytical mindset to transform raw data into actionable insights.
2. Collaboration: ability to collaborate effectively with data analysts and business stakeholders; strong verbal and written communication skills to explain technical concepts to non-technical audiences.
3. Adaptability and continuous learning: open to exploring new GCP features and rapidly adapting to changes in cloud technology.
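The "automated, scalable, and reliable pipelines" requirement above usually implies handling transient failures gracefully. Below is a minimal, hypothetical sketch of retry with exponential backoff; `flaky_fetch` is a stand-in for a real client call (for example, to BigQuery or Cloud Storage), not an actual GCP API.

```python
# Retry-with-exponential-backoff: a small reliability building block
# for data pipelines that call external services.
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying transient failures with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, ...

calls = {"n": 0}

def flaky_fetch():
    # Hypothetical stand-in: fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

result = with_retries(flaky_fetch)
print(result)  # ok
```

Managed orchestrators (Airflow/Cloud Composer, Cloud Build) provide the same behavior through task-level retry settings, which is usually preferable to hand-rolled loops.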

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

Cambridge Mobile Telematics (CMT) is the world's largest telematics service provider. Its mission is to make the world's roads and drivers safer. The company's AI-driven platform, DriveWell Fusion®, gathers sensor data from millions of IoT devices, including smartphones, proprietary Tags, connected vehicles, dashcams, and third-party devices, and fuses it with contextual data to create a unified view of vehicle and driver behavior. Auto insurers, automakers, commercial mobility companies, and the public sector use insights from CMT's platform to power risk assessment, safety, claims, and driver improvement programs. Headquartered in Cambridge, MA, with offices in Budapest, Chennai, Seattle, Tokyo, and Zagreb, CMT measures and protects tens of millions of drivers across the world every day.

We are in search of a collaborative, self-motivated MLOps Engineer to develop, maintain, and scale our machine learning platforms and infrastructure. You will work closely with our data science teams to optimize our use of Databricks, Spark, Ray, and the AWS services that support building large-scale machine learning models, while balancing cost-efficiency and speed. You will empower the data science team by expediting the delivery of new models and enhancing the efficiency of feature engineering, model training, and validation processes. As an MLOps Engineer, you will play a crucial role in developing and managing the machine learning platform and infrastructure. You will be empowered to contribute to technical designs and process improvements, and will collaborate cross-functionally with data scientists, mobile developers, platform engineers, and QA engineers to foster a collaborative and innovative environment.

Responsibilities:
Design, develop, and maintain features for machine learning infrastructure and workflows
Write high-quality, well-documented, testable code
Implement infrastructure as code using tools like Terraform
Troubleshoot and resolve issues across various systems and platforms
Actively participate in team meetings, including scrums and planning sessions
Conduct and participate in code and design reviews
Provide regular updates on progress and task completion
Maintain and monitor test and production systems for stability, security, and performance
Share knowledge and contribute to internal documentation and onboarding materials
Assume ownership of assigned tasks and ensure completion
Assist with production issues, including on-call responsibilities
Adhere to CMT's development and operational standards
Complete any additional tasks as they arise

Qualifications:
Bachelor's degree in Computer Science or a related field, or equivalent experience/certification
Conceptual understanding of or experience in AI, machine learning, or data science
Strong communication skills
Proficiency in Python and SQL
Ability to learn quickly and solve complex problems
Willingness to participate in on-call rotations
Competence in general development tasks on Linux and macOS
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes)

Nice to have:
Professional engineering experience
Experience with platforms like Databricks and AWS services (EC2, S3, Redshift)
Experience with deep learning frameworks (TensorFlow, Keras, Torch)
Experience with software development processes and scripting tools (Python, Pandas, NumPy, scikit-learn)
Experience with Docker, Kubernetes, Terraform, and Jenkins CI/CD pipelines
AWS or Kubernetes certification
Understanding of Agile methodologies (Scrum)
User-level knowledge of issue tracking systems (Jira)

Compensation and Benefits:
Fair and competitive salary based on skills and experience
Equity may be awarded in the form of Restricted Stock Units (RSUs)
Medical benefits (health insurance, personal accident insurance, group term life insurance), gratuity, parental leave, sick leave, and public holidays
Employees are eligible for flexible allowances, including Leave Travel Assistance, telephone/mobile expenses, professional development expenses, meal coupons, and vehicle reimbursement
Flexible scheduling and work-from-home policy depending on role and responsibilities

Additional Perks:
Feel great working to improve road safety around the world!
Join one of our many employee resource groups, including Black, AAPI, LGBTQIA+, Women, Book Club, and Health & Wellness
Extensive education and employee assistance programs
CMT will do all that is possible to support our employees and create a positive and inclusive work environment for all!

Commitment to Diversity and Inclusion:
At CMT, we believe the best ideas come from a mix of backgrounds and perspectives. We are an equal-opportunity employer committed to creating a workplace and culture where everyone feels valued, respected, and empowered to bring their unique talents and perspectives. Diversity is essential to our success, and we actively seek candidates from all backgrounds to join our growing team. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

CMT is headquartered in Cambridge, MA. To learn more, visit www.cmtelematics.com and follow us on X @cmtelematics.

Posted 1 day ago

Apply

12.0 - 17.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

The role of Sr. Analytics Consultant / Lead exists within the Analytics offshore and onshore teams to develop innovative analytics solutions using Data Visualization / BI solutions, Gen AI / ML, Data Analytics and Data Science that will generate tangible value for business stakeholders and customers. The role focuses on using sophisticated data analysis and modelling techniques, alongside cognitive computing and artificial intelligence, with business acumen to discover insights that will guide business decisions and uncover business opportunities.

Key Accountabilities
- Manage large Advanced Analytics teams, owning client delivery and team growth accountabilities with a consulting mindset.
- Consulting, transformation, building proposals and competency. Experience within the insurance services industry is essential.
- Confidently lead large-scale data projects, working in product teams. Highly experienced in solving business problems with data-led tech solutions, managing diverse cross-functional teams with a strong commercial mindset.
- Interpret big data to identify and analyse emerging trends and produce business insights that monitor the performance of the portfolios and continuously drive an improvement in business results.
- Develop advanced business performance and data analytics tools to assist the Senior Specialists, Portfolio Managers, business stakeholders (including but not limited to Portfolio, Customer & Marketing functions) and wider Commercial team members to make the required data-based recommendations, then implement and monitor them accurately.
- Develop statistical models to predict business performance and customer behaviour. Research customer behaviours and attitudes, building in-depth knowledge and understanding of differences in customer-level profitability.
- Promote innovation by improving current processes and developing new analytical methods and factors.
- Identify, investigate and introduce additional rating factors with the objective of improving product risk and location selection to maximise profit.
- Provide consulting advice and solutions to solve business clients' hardest pain points and realise the biggest business opportunities through advanced use of data and algorithms. Can work on projects across functions based on the needs of the business.
- Actively add value by lifting the business capability of process automation.
- Build, enhance and maintain quality relationships with all internal and external customers. Adopt a customer focus in the delivery of internal/external services. Build positive stakeholder relationships to foster a productive environment and open communication channels.
- Bring new Data Science thinking to the group by staying on top of the latest developments, industry meet-ups, etc.

Requirements
- Expert-level knowledge of Gen AI/ML, Python, BI and visualization, transformation, and business consulting, including building technical proposals.
- Expert-level knowledge of statistical concepts.
- Typically 12+ years of relevant experience (in a Data Science or Advanced Analytics consulting field of work), with at least 10 years of leadership experience.
- Experience within the insurance services industry is essential. Confident in leading large-scale data projects and working in product teams. Highly experienced in solving business problems with data-led tech solutions, managing diverse cross-functional teams with a strong commercial mindset.

Qualifications
Superior results in a Bachelor's degree in a highly technical subject area (statistics, actuarial, engineering, maths, programming, etc.). Postgraduate degree in a relevant statistical or data science related area (or equivalent, demonstrable online study).

Key Capabilities/Technical Competencies (skills, knowledge, technical or specialist capabilities)

Mandatory
- Proven ability to engage in a team to achieve individual, team and divisional goals.
- Consulting, transformation, building proposals and competency. Lead and manage large-scale teams from a people, project management and client management perspective.
- Solid programming experience in Tableau, R, SQL and Python (AI/ML).
- Experience with AWS or another cloud service.
- Familiarity with data lake platforms (e.g. Snowflake and Databricks).
- Demonstrated understanding of advanced machine learning algorithms, including some exposure to NLP and image processing.
- Expert-level understanding of statistical concepts.
- Planning and organization - advanced level.
- Delegation, project management, delivery and productionizing analytical services - advanced.
- High degree of specialist expertise within one or more data science capabilities, e.g. unstructured data analysis, cognitive/AI solutions (incl. use of API platforms such as IBM Watson, MS Azure, AWS, etc.), Hadoop/Spark based ecosystems, etc.
- Familiarity with Gen AI concepts.

Highly Valued
- Good understanding of insurance products, the industry, the market environment, customer segments and key business drivers.
- Strong knowledge of finance and budgeting/forecasting of key business drivers, with the ability to interpret and analyse reports relevant to the area of responsibility.

Additional
- Creativity and Innovation - a curious mind that does not accept the status quo. Design thinking experience highly valued.
- Communication Skills - superior communication skills to be able to co-design solutions with customers. Emotional intelligence and the ability to communicate complex ideas to a range of internal stakeholders. Consulting skills highly valued.
- Business Focus - advanced analytical skills will need to be practiced with a significant business focus to ensure practical solutions that deliver tangible value.
- Strategic Focus - critically evaluate both company and key business customers' strategy, while keeping abreast of best-practice advanced analytics strategies.
- Change management capability - ability to recognise, understand and support the need for change and its anticipated impact on both the team and self. Adaptable and responsive to a continuously changing environment.
- Customer service - proven commitment to delivering a quality, differentiated experience.
- Time management skills - prioritisation of work without supervision.
- Project management - ability to plan, organize, implement, monitor and control projects, ensuring efficient utilisation of technical and business resources to achieve project objectives.
- Partnering - ability to deliver solutions utilising both onshore and offshore resources.

Job Location

Posted 1 day ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

To be added.

Stay up to date on everything Blackbaud: follow us on LinkedIn, X, Instagram, Facebook and YouTube. Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status, or any other basis protected by federal, state, or local law.

Posted 1 day ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

As a Data Engineer, you are required to:
- Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire data pipeline.
- Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
- Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets, to enable efficient querying.
- Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
- Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.

Qualification
Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level
At least 3-5 years of hands-on experience in Data Engineering and ETL.
Desired Knowledge & Experience
- Spark: Spark 3.x, RDD/DataFrames/SQL, batch/Structured Streaming; knowledge of Spark internals (Catalyst/Tungsten/Photon)
- Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
- IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
- Test: pytest, Great Expectations
- CI/CD: YAML Azure Pipelines, continuous delivery, acceptance testing
- Big data design: Lakehouse/Medallion architecture, Parquet/Delta, partitioning, distribution, data skew, compaction
- Languages: Python/functional programming (FP)
- SQL: T-SQL/Spark SQL/HiveQL
- Storage: data lake and big data storage design

Additionally, it is helpful to know the basics of:
- Data pipelines: ADF/Synapse Pipelines/Oozie/Airflow
- Languages: Scala, Java
- NoSQL: Cosmos, Mongo, Cassandra
- Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
- SQL Server: T-SQL, stored procedures
- Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
- Data catalog: Azure Purview, Apache Atlas, Informatica

Required Soft Skills & Other Capabilities
- Great attention to detail and good analytical abilities.
- Good planning and organizational skills.
- Collaborative approach to sharing ideas and finding solutions.
- Ability to work independently and also in a global team environment.
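The lakehouse/medallion design mentioned in the posting (raw "bronze" data refined into a cleaned "silver" layer and an aggregated "gold" layer) can be sketched even without a Spark cluster. The pandas version below mirrors what the equivalent Spark DataFrame code would do; all table and column names are invented for illustration, not taken from the posting.

```python
import pandas as pd

# Bronze: raw ingested events, possibly duplicated and loosely typed.
# (In the posting's stack this would be an Autoloader stream landing
# in a Delta table rather than an in-memory frame.)
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "7.0", "3.25"],
    "country": ["IN", "IN", "IN", "US"],
})

# Silver: deduplicate and enforce a schema.
silver = (
    bronze.drop_duplicates(subset="order_id")
          .astype({"amount": "float64"})
)

# Gold: business-level aggregate, analogous to
# df.groupBy("country").agg(F.sum("amount")) in PySpark.
gold = silver.groupby("country", as_index=False)["amount"].sum()

print(gold)
```

In Spark the same three steps would be lazy transformations executed by Catalyst/Tungsten at the final action; the pandas sketch only shows the data flow, not the distributed execution.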

Posted 1 day ago

Apply

10.0 - 15.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Novo Nordisk Global Business Services (GBS) India
Department - Global Data & Artificial Intelligence

Are you passionate about building scalable data pipelines and optimising data workflows? Do you want to work at the forefront of data engineering, collaborating with cross-functional teams to drive innovation? If so, we are looking for a talented Data Engineer to join our Global Data & AI team at Novo Nordisk. Read on and apply today for a life-changing career!

The Position
As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining data pipelines and integration solutions to support analytics, Artificial Intelligence workflows, and business intelligence. This includes:
- Design, implement, and maintain scalable data pipelines and integration solutions aligned with the overall data architecture and strategy.
- Implement data transformation workflows using modern ETL/ELT approaches, while establishing best practices for data engineering, including testing methodologies and documentation.
- Optimize data workflows by harmonizing and securely transferring data across systems, while collaborating with stakeholders to deliver high-performance solutions for analytics and Artificial Intelligence.
- Monitor and maintain data systems to ensure their reliability.
- Support data governance by ensuring data quality and consistency, while contributing to architectural decisions shaping the data platform's future.
- Mentor junior engineers and foster a culture of engineering excellence.

Qualifications
- Bachelor's or Master's degree in Computer Science, Software Development, or Engineering.
- Over 10 years of overall professional experience, including more than 4 years of specialized expertise in data engineering.
- Experience in developing production-grade data pipelines using Python, Databricks and Azure cloud, with a strong foundation in software engineering principles.
- Experience in the clinical data domain, with knowledge of standards such as CDISC SDTM and ADaM (good to have).
- Experience working in a regulated industry (good to have).

About the department
You will be part of the Global Data & AI team. Our department is globally distributed, and its mission is to harness the power of Data and Artificial Intelligence, integrating it seamlessly into the fabric of Novo Nordisk's operations. We serve as the vital link, weaving together the realms of Data and Artificial Intelligence throughout the whole organization, empowering Novo Nordisk to realize its strategic ambitions through our pivotal initiatives. The atmosphere is fast-paced and dynamic, with a strong focus on collaboration and innovation. We work closely with various business domains to create actionable insights and drive commercial excellence.
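The "testing methodologies" best practice called out above can be as small as unit-testing each transformation step against a tiny in-memory input. A minimal, hypothetical sketch in the pytest style (function name, columns, and the grams-to-kilograms rule are all invented for illustration, not Novo Nordisk code):

```python
import pandas as pd

def clean_measurements(df: pd.DataFrame) -> pd.DataFrame:
    """One pipeline step: drop rows with missing subject IDs and
    standardize units (illustrative rule: grams -> kilograms)."""
    out = df.dropna(subset=["subject_id"]).copy()
    mask = out["unit"] == "g"
    out.loc[mask, "value"] = out.loc[mask, "value"] / 1000.0
    out.loc[mask, "unit"] = "kg"
    return out

# A unit test: build a tiny input, run the step, assert on the output.
def test_clean_measurements():
    raw = pd.DataFrame({
        "subject_id": ["S1", None, "S2"],
        "value": [70.0, 1.0, 64500.0],
        "unit": ["kg", "kg", "g"],
    })
    cleaned = clean_measurements(raw)
    assert len(cleaned) == 2                   # null subject dropped
    assert set(cleaned["unit"]) == {"kg"}      # units standardized
    assert cleaned.loc[cleaned["subject_id"] == "S2", "value"].iloc[0] == 64.5

test_clean_measurements()
```

Because each step is a pure function of a DataFrame, the same tests run identically in CI and against production-shaped samples.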

Posted 1 day ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

The Manager, Software Development Engineering leads a team of technical experts in successfully executing technology projects and solutions that align with the strategy and have broad business impact. The Manager, Software Development Engineering will work closely with development teams to identify and understand key features and their underlying functionality, while also partnering closely with Product Management and UX Design. They may exercise influence and govern overall end-to-end software development life cycle activities, including management of support and maintenance releases, minor functional releases, and major projects. The Manager, Software Development Engineering will lead and provide technical guidance for process improvement programs while leveraging engineering best practices. In this people leadership role, Managers will recruit, train, motivate, coach, grow and develop Software Development Engineer team members at a variety of levels through their technical expertise, providing continuous feedback to ensure employee expectations, customer needs and product demands are met.

About the Role:
- Lead and manage a team of engineers, providing mentorship and fostering a collaborative environment.
- Design, implement, and maintain scalable data pipelines and systems to support business analytics and data science initiatives.
- Collaborate with cross-functional teams to understand data requirements and ensure data solutions align with business goals.
- Ensure data quality, integrity, and security across all data processes and systems.
- Drive the adoption of best practices in data engineering, including coding standards, testing, and automation.
- Evaluate and integrate new technologies and tools to enhance data processing and analytics capabilities.
- Prepare and present reports on engineering activities, metrics, and project progress to stakeholders.

About You:
- Proficiency in programming languages such as Python, Java, or Scala.
- Data engineering with APIs and any programming language.
- Strong understanding of APIs, forward-looking knowledge of AI/ML tools and models, and some knowledge of software architecture.
- Experience with cloud platforms (e.g., AWS, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
- Experience with REST/OData APIs.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and interpersonal skills.
- Experience with data warehousing solutions such as BigQuery or Snowflake.
- Familiarity with data visualization tools and techniques.
- Understanding of machine learning concepts and frameworks.

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description
Oracle Intelligent Data Lake offers a unified developer experience built on open source standards, a data catalog capability, Apache Spark and Apache Flink for data processing, and a Jupyter Notebook for data analysis and visualization. With these combined services, customers will be able to build a data lake, connect and extend analytical applications with real-time data from any source, inventory assets, transform data, and create end-to-end data orchestration with unified governance and security.

Responsibilities
As a Senior Software Developer, you will:
- Implement intuitive and seamless customer experiences.
- Participate in technical discussions and translate the outcomes into technical designs and implementations.
- Work closely with product managers and UX developers to understand requirements, translate them into technical specifications, and deliver features in a timely manner.
- Quickly translate wireframes into prototypes and production-ready interfaces.
- Participate in code reviews, providing constructive feedback and ensuring code quality and adherence to coding standards.

Required Qualifications / Skills
- 3+ years of frontend development experience.
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- Proficient in ES6, JavaScript/TypeScript, ReactJS.
- Strong skills in HTML5/CSS3, including CSS preprocessors such as LESS.
- Experience with large single-page applications with state management.
- Experience integrating backend services using REST APIs.
- Experience with unit test frameworks and E2E test frameworks.
- Experience with best practices to ensure inclusive design.
- Experience working across multiple continents/cultures.
- Proficient in Git and distributed source control.
- Experience working in cloud infrastructure.

Good To Have Skills
- Experience in creating micro frontends.
- Knowledge of performance optimization techniques for frontend applications.
- Experience with REST API development.
- Understanding of, and ability to mitigate, common frontend security vulnerabilities.
- Experience with building reusable UI components.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 day ago

Apply

7.0 - 9.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters?

About The Role
We are looking for a highly motivated individual with strong organizational and technical skills for the position of Lead Data Engineer / Data Engineering Manager (Snowflake). You will play a critical role working on the cutting edge of data engineering and analytics, leveraging predictive models, machine learning and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives.
- Effectively communicate across various levels, including Executives, and functions within the global organization.
- Demonstrate strong leadership skills, with the ability to drive projects/tasks to deliver value.
- Engage with stakeholders, business analysts and the project team to understand the data requirements.
- Design analytical frameworks to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available and prepare data for problem solving.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).
About You
You're a fit for the Lead Data Engineer / Data Engineering Manager (Snowflake) role if your background includes:
- Qualifications: B.Tech/M.Tech/MCA or equivalent.
- Experience: 7-9 years of corporate experience.
- Location: Bangalore, India.
- Hands-on experience in developing data models for large-scale data warehouses/data lakes (Snowflake, BW).
- Ability to map the data journey from operational system sources, through any transformations in transit, to its delivery into enterprise repositories (Warehouse, Data Lake, Master Data, etc.).
- Enabling the overall master and reference data strategy, including the procedures to ensure the consistency and quality of Finance reference data.
- Experience across ETL, SQL and other emerging data technologies, with experience in integrations of a cloud-based analytics environment.
- Ability to build and refine end-to-end data workflows to offer actionable insights.
- Fair understanding of data strategy and data governance processes.
- Knowledge of BI analytics and visualization tools: Power BI, Tableau.

Posted 1 day ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Python Developer
Location: Bengaluru, Karnataka, India
Experience Level: 3-5 Years
Employment Type: Full-Time

Role Overview
We are seeking a skilled Python Developer with a strong background in data manipulation and analysis using NumPy and Pandas, coupled with proficiency in SQL. The ideal candidate will have experience in building and optimizing data pipelines, ensuring efficient data processing and integration.

Key Responsibilities
- Develop and maintain robust data pipelines and ETL processes using Python, NumPy, and Pandas.
- Write efficient SQL queries for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Implement data validation and quality checks to ensure data integrity.
- Optimize existing codebases for performance and scalability.
- Document processes and maintain clear records of data workflows.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-5 years of professional experience in Python development.
- Proficiency in NumPy and Pandas for data manipulation and analysis.
- Strong command of SQL and experience with relational databases like MySQL, PostgreSQL, or SQL Server.
- Familiarity with version control systems, particularly Git.
- Experience with data visualization tools and libraries is a plus.

Preferred Skills
- Experience with data visualization libraries such as Matplotlib or Seaborn.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Knowledge of big data tools and frameworks like Spark or Hadoop.
- Understanding of machine learning concepts and libraries.

Why Join Enterprise Minds
Enterprise Minds is a forward-thinking technology consulting firm dedicated to delivering next-generation solutions. By joining our team, you'll work on impactful projects, collaborate with industry experts, and contribute to innovative solutions that drive business transformation.
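The "data validation and quality checks" responsibility in postings like this one often amounts to a small reusable helper run at each pipeline stage. A hedged sketch with NumPy and Pandas (the column names and rules are invented for illustration):

```python
import numpy as np
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems found in the frame.
    The rules below are illustrative, not from any real spec."""
    problems = []
    if df["price"].lt(0).any():
        problems.append("negative prices")
    if df["sku"].duplicated().any():
        problems.append("duplicate SKUs")
    if df["qty"].isna().any():
        problems.append("missing quantities")
    return problems

frame = pd.DataFrame({
    "sku": ["A1", "A2", "A2"],
    "price": [9.99, -1.0, 4.5],
    "qty": [3, np.nan, 7],
})
print(validate(frame))
# -> ['negative prices', 'duplicate SKUs', 'missing quantities']
```

Returning a list of problems (rather than raising on the first one) lets a pipeline log every issue in a batch before deciding whether to quarantine it.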

Posted 1 day ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

About the Role
In this opportunity, as Senior Data Engineer, you will:
- Develop and maintain data solutions using resources such as dbt, Alteryx, and Python.
- Design and optimize data pipelines, ensuring efficient data flow and processing.
- Work extensively with databases, SQL, and various data formats including JSON, XML, and CSV.
- Tune and optimize queries to enhance performance and reliability.
- Develop high-quality code in SQL, dbt, and Python, adhering to best practices.
- Understand and implement data automation and API integrations.
- Leverage AI capabilities to enhance data engineering practices.
- Understand integration points related to upstream and downstream requirements.
- Proactively manage tasks and work towards completion against tight deadlines.
- Analyze existing processes and offer suggestions for improvement.

About You
You're a fit for the role of Senior Data Engineer if your background includes:
- Strong interest and knowledge in data engineering principles and methods.
- 6+ years of experience developing data solutions or pipelines.
- 6+ years of hands-on experience with databases and SQL.
- 2+ years of experience programming in an additional language.
- 2+ years of experience in query tuning and optimization.
- Experience working with SQL, JSON, XML, and CSV content.
- Understanding of data automation and API integration.
- Familiarity with AI capabilities and their application in data engineering.
- Ability to adhere to best practices for developing programmatic solutions.
- Strong problem-solving skills and ability to work independently.
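Query tuning of the kind this role describes usually starts by reading the query plan before and after adding an index. A self-contained sketch using SQLite from Python's standard library (the schema and index names are invented; production engines expose the same idea through their own EXPLAIN variants):

```python
import sqlite3

# Illustrative schema: the same lookup goes from a full table scan
# to an index search once a suitable index exists.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, f"c{i % 100}", i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer = 'c7'"

# Before tuning: the plan reports a full-table scan.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

# After adding an index on the filtered column, the plan reports
# an index search instead.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

The same before/after discipline applies to Snowflake or SQL Server plans; only the tooling for inspecting the plan changes.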
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrows challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our valuesObsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. 
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news.

We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 day ago


6.0 - 11.0 years

10 - 14 Lacs

Hyderabad, Gurugram

Work from Office


About the Role:

Grade Level (for internal use): 10

Position Title: Senior Software Developer

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you:

  • Opportunities for innovation and learning new state-of-the-art technologies
  • Working in pure agile and scrum methodology

Responsibilities:

  • Design and implement software-related projects.
  • Perform analyses and articulate solutions.
  • Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
  • Develop project plans with task breakdowns and estimates.
  • Manage and improve existing solutions.
  • Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
What We're Looking For:

Basic Qualifications:

  • Bachelor's degree in Computer Science or equivalent
  • 6+ years related experience
  • Passionate, smart, and articulate developer
  • Strong C#, WPF, and SQL skills
  • Experience implementing Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests
  • Dependency Injection
  • Able to demonstrate strong OOP skills
  • Able to work well individually and with a team
  • Strong problem-solving skills
  • Good work ethic, self-starter, and results-oriented
  • Interest and experience in Environmental and Sustainability content is a plus
  • Agile/Scrum experience a plus
  • Exposure to Data Engineering & Big Data technologies like Hadoop, Spark/Scala, Nifi & ETL is a plus

Preferred Qualifications:

  • Experience with Docker is a plus
  • Experience working in cloud computing environments such as AWS, Azure, or GCP
  • Experience with large-scale messaging systems such as Kafka, RabbitMQ, or commercial systems

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:

Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.

  • Health & Wellness: Health care coverage designed for the mind and body.
  • Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
  • Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
  • Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
  • Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

-----------------------------------------------------------

Equal Opportunity Employer

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

-----------------------------------------------------------

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 1 day ago


8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office


About the Role:

Grade Level (for internal use): 11

S&P Global Market Intelligence

The Role: Lead Software Engineer

The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group business and customer needs. We focus on platform scale to support the business by following a common data lifecycle that accelerates business value. We provide essential intelligence for the Financial Services, Real Estate, and Insurance industries.

The Impact: The FIG Data Engineering team will be responsible for implementing and maintaining services and/or tools to support existing feed systems, which allows users to consume FIG datasets, and for making FIG data available to a data fabric for wider consumption and processing within the company.

What's in it for you: The ability to work with global stakeholders and on the latest tools and technologies.

Responsibilities:

  • Build new data acquisition and transformation pipelines using big data and cloud technologies.
  • Work with the broader technology team, including information architecture and data fabric teams, to align pipelines with the lodestone initiative.

What We're Looking For:

  • Bachelor's in Computer Science or equivalent with at least 8+ years of professional software work experience
  • Experience with Big Data platforms such as Apache Spark, Apache Airflow, Google Cloud Platform, Apache Hadoop
  • Deep understanding of REST, good API design, and OOP principles
  • Experience with object-oriented/object-functional scripting languages: Python, C#, Scala, etc.
  • Good working knowledge of relational SQL and NoSQL databases
  • Experience in maintaining and developing software in production utilizing cloud-based tooling (AWS, Docker & Kubernetes, Okta)
  • Strong collaboration and teamwork skills with excellent written and verbal communication skills
  • Self-starter and motivated, with the ability to work in a fast-paced software development environment
  • Agile experience highly desirable
  • Experience in Snowflake or Databricks will be a big plus

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:

Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.

  • Health & Wellness: Health care coverage designed for the mind and body.
  • Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
  • Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
  • Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
  • Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
-----------------------------------------------------------

Equal Opportunity Employer

S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

-----------------------------------------------------------

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 1 day ago


8.0 - 13.0 years

8 - 12 Lacs

Noida

Work from Office


Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology.

About the team: The SAP Support team facilitates existing ERP support and implements new processes for all Paytm entities.

About the role:

  1. Interacting with users on a day-to-day basis for timely closing of tickets.
  2. Assisting users in period-end activities.
  3. Preparing functional specifications and interacting with ABAPers on development.
  4. Attending and conducting weekly meetings with the SAP and core team members.
  5. Proactively discussing critical issues with other consultants.
  6. Engaging with various Business & Technology teams within Paytm to identify common bottlenecks, especially on the technology front.
  7. Tracking program/project performance, specifically to analyze the successful completion of short- and long-term goals.
  8. Enabling and encouraging the use of common services to increase the speed of development and execution.
  9. Working on projects in the SAP FICO sub-modules: General Ledger (FI-GL), Accounts Payable (FI-AP), Accounts Receivable (FI-AR), Asset Accounting (FI-AA).
  10. Integration of the FI-MM and FI-SD modules.
  11. Resolving issues within the time limits specified in SLAs.

Expectations/Requirements:

  - Minimum 8 years of experience in SAP FICO sub-modules
  - Program-manage initiatives that are driven centrally for technology improvements
  - Familiarity with building functional specifications and the testing process
  - Experience in at least one full-cycle implementation project; experience in process design, configuration, support, and troubleshooting of S/4HANA across multiple industries
  - Awareness of the business functions of the GRDC, SAC, SD, and MM modules and of the development lifecycle for ABAP and Fiori developments with RICEFs
  - High level of drive, initiative, and self-motivation
  - Ability to take internal and external stakeholders along

Education: CA, MBA in Finance, or B.Tech preferred; any graduate may apply.

Why join us: Because you get an opportunity to make a difference and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!

Posted 1 day ago


8.0 - 12.0 years

25 - 40 Lacs

Bengaluru

Hybrid


Senior Software Engineer

Location: Bengaluru

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include:

  • Building, refining, tuning, and maintaining our real-time and batch data infrastructure
  • Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.
  • Maintaining data quality and accuracy across production data systems
  • Working with Data Analysts to develop ETL processes for analysis and reporting
  • Working with Product Managers to design and build data products
  • Working with our DevOps team to scale and optimize our data infrastructure
  • Participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects
  • Participating in on-call rotation in their respective time zones (be available by phone or email in case something goes wrong)

Desired Characteristics:

  • Minimum 8 years of software engineering experience; an undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired
  • 2+ years of experience and fluency in Python
  • Proficient with relational databases and advanced SQL
  • Expert in the usage of services like Spark and Hive; experience working with container-based solutions is a plus
  • Experience using a scheduler such as Apache Airflow, Apache Luigi, Chronos, etc.
  • Experience using cloud services (AWS) at scale
  • Proven long-term experience with, and enthusiasm for, distributed data processing at scale, and eagerness to learn new things
  • Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
  • Exposure to the whole software development lifecycle, from inception to production and monitoring
  • Experience in the Advertising Attribution domain is a plus
  • Experience in agile software development processes
  • Excellent interpersonal and communication skills

Posted 1 day ago


Exploring Spark Jobs in India

The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.
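To make the "big data processing" part concrete: the heart of most Spark jobs is a chain of map/filter transformations followed by an aggregation, applied in parallel across a cluster. The per-record logic can be sketched in plain Python without a Spark installation (the sample log lines and the stop-word rule below are invented purely for illustration):

```python
from collections import Counter

# Hypothetical sample of log lines. In Spark these would live in an RDD
# or DataFrame partitioned across many machines, but the record-level
# map -> filter -> aggregate pattern is the same.
lines = [
    "spark makes big data simple",
    "big data needs big tools",
    "spark jobs scale out",
]

# map step: split each line into words (Spark: rdd.flatMap(str.split))
words = [w for line in lines for w in line.split()]

# filter step: drop short words (Spark: rdd.filter(lambda w: len(w) > 3))
kept = [w for w in words if len(w) > 3]

# reduceByKey-style aggregation: count occurrences of each word
counts = Counter(kept)

print(counts.most_common(2))  # the two most frequent surviving words
```

Spark's value is that it runs exactly this kind of pipeline over datasets far too large for one machine, handling partitioning and fault tolerance for you.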

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities have a high concentration of tech companies and startups actively hiring for Spark roles.

Average Salary Range

The average salary range for Spark professionals in India varies by experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Salaries may vary based on the company, location, and specific job requirements.

Career Path

In the field of Spark, a typical career progression may look like:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.

Related Skills

Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:

  • Hadoop
  • Java or Scala programming
  • Data processing and analytics
  • SQL databases

Having a combination of these skills can make a candidate more competitive in the job market.

Interview Questions

  • What is Apache Spark and how is it different from Hadoop? (basic)
  • Explain the difference between RDD, DataFrame, and Dataset in Spark. (medium)
  • How does Spark handle fault tolerance? (medium)
  • What is lazy evaluation in Spark? (basic)
  • Explain the concept of transformations and actions in Spark. (basic)
  • What are the different deployment modes in Spark? (medium)
  • How can you optimize the performance of a Spark job? (advanced)
  • What is the role of a Spark executor? (medium)
  • How does Spark handle memory management? (medium)
  • Explain the Spark shuffle operation. (medium)
  • What are the different types of joins in Spark? (medium)
  • How can you debug a Spark application? (medium)
  • Explain the concept of checkpointing in Spark. (medium)
  • What is lineage in Spark? (basic)
  • How can you monitor and manage a Spark application? (medium)
  • What is the significance of the Spark Driver in a Spark application? (medium)
  • How does Spark SQL differ from traditional SQL? (medium)
  • Explain the concept of broadcast variables in Spark. (medium)
  • What is the purpose of the SparkContext in Spark? (basic)
  • How does Spark handle data partitioning? (medium)
  • Explain the concept of window functions in Spark SQL. (advanced)
  • How can you handle skewed data in Spark? (advanced)
  • What is the use of accumulators in Spark? (advanced)
  • How can you schedule Spark jobs using Apache Oozie? (advanced)
  • Explain the process of Spark job submission and execution. (basic)
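Several of the questions above (skewed data, partitioning, shuffles) come down to how evenly keys spread across partitions. A common answer to the skewed-data question is key salting: append a random suffix to keys so a hot key's rows spread over several partitions, pre-aggregate per salted key, then strip the salt and combine. Here is a framework-agnostic Python sketch of that two-stage idea (the country keys and salt factor are made up for illustration; in Spark you would add the salt column with `withColumn` or an RDD `map` before the shuffle):

```python
import random
from collections import Counter, defaultdict

random.seed(7)  # deterministic for the example

# Skewed input: one "hot" key dominates. In Spark, an unsalted groupBy
# would send nearly every row to a single shuffle partition.
rows = [("IN", 1)] * 9_000 + [("US", 1)] * 500 + [("UK", 1)] * 500

SALT_BUCKETS = 4  # how many ways to split each key (tunable)

# Stage 1: salt the key, then aggregate per (key, salt) pair.
# In Spark this partial aggregation happens in parallel per partition.
salted = defaultdict(int)
for key, value in rows:
    salted[(key, random.randrange(SALT_BUCKETS))] += value

# Stage 2: strip the salt and combine the much smaller partial results.
totals = Counter()
for (key, _salt), partial in salted.items():
    totals[key] += partial

print(dict(totals))  # same totals as an unsalted group-by would give
```

The trade-off is an extra aggregation stage in exchange for removing the single-partition bottleneck, which is usually a large net win when one key carries most of the data.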

Closing Remark

As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
