
3773 Scala Jobs - Page 44

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Working with Us

Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more at careers.bms.com/working-with-us.

Position: Senior Data Engineer
Location: Hyderabad, India

At Bristol Myers Squibb, we are inspired by a single vision: transforming patients' lives through science. With a focus on oncology, hematology, immunology, and cardiovascular disease, and with one of the most diverse and promising pipelines in the industry, each of our passionate colleagues contributes to innovations that drive meaningful change. We bring a human touch to every treatment we pioneer. Join us and make a difference.

Position Summary

At BMS, digital innovation and Information Technology are central to our vision of transforming patients' lives through science. To accelerate our ability to serve patients around the world, we must unleash the power of technology. We are committed to being at the forefront of transforming the way medicine is made and delivered by harnessing the power of computer and data science, artificial intelligence, and other technologies to promote scientific discovery, faster decision making, and enhanced patient care.
If you want an exciting and rewarding career that is meaningful, consider joining our diverse team! As a Data Engineer based out of our BMS Hyderabad site, you will be part of the Data Platform team, supporting the larger Data Engineering community that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data.

Key Responsibilities

- Design, build, and maintain ETL pipelines and data products, evolve those data products, and select the data architecture best suited to our organization's data needs
- Deliver high-quality data products and analytics-ready data solutions
- Work with an end-to-end ownership mindset; innovate and drive initiatives through completion
- Develop and maintain data models to support our reporting and analysis needs
- Optimize data storage and retrieval to ensure efficient performance and scalability
- Collaborate with data architects, data analysts, and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements
- Ensure data quality and integrity through data validation and testing
- Implement and maintain security protocols to protect sensitive data
- Stay up to date with emerging trends and technologies in data engineering and analytics
- Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams, and the Data Community lead to shape and adopt data and technology strategy
- Serve as the Subject Matter Expert on Data & Analytics Solutions
- Knowledgeable in evolving trends in data platforms and product-based implementation
- End-to-end ownership mindset in driving initiatives through completion
- Comfortable working in a fast-paced environment with minimal oversight
- Mentors other team members effectively to unlock their full potential
- Prior experience working in an Agile/product-based environment

Qualifications & Experience

- 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment
- Breadth of experience in technology capabilities spanning the full life cycle of data management, including data lakehouses, master/reference data management, data quality, and analytics/AI-ML
- In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem
- Hands-on experience developing and delivering data and ETL solutions with technologies such as AWS data services (Redshift, Athena, Lake Formation, etc.); experience with Cloudera Data Platform and Tableau is a plus
- 5+ years of experience in data engineering or software development
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Strong programming skills in languages and libraries such as Python, R, PyTorch, PySpark, Pandas, Scala, etc.
- Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc.
- Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Functional knowledge of, or prior experience in, the life sciences Research and Development domain is a plus
- Experience and expertise in establishing agile, product-oriented teams that work effectively with teams in the US and at other global BMS sites
- Initiates challenging opportunities that build strong capabilities for self and team
- Demonstrates a focus on improving processes, structures, and knowledge within the team
- Leads in analyzing current states, delivers strong recommendations grounded in an understanding of the environment's complexity, and executes to bring complex solutions to completion

Why You Should Apply

Around the world, we are passionate about making an impact on the lives of patients with serious diseases. Empowered to apply our individual talents and diverse perspectives in an inclusive culture, our shared values of passion, innovation, urgency, accountability, inclusion, and integrity bring out the highest potential of each of our colleagues. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Our company is committed to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace adjustments and ongoing support in their roles. Applicants can request an accommodation prior to accepting a job offer. If you require reasonable accommodation in completing this application, or any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.
If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers

With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol

BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer.
If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit https://careers.bms.com/california-residents/ for important additional information. Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
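The responsibilities in this listing center on ETL pipelines with built-in data validation. As a minimal, hypothetical sketch in plain Python (not BMS's actual AWS Glue stack; the field names and rules are invented for illustration), the extract-transform-load pattern it describes looks like:

```python
# Hypothetical ETL sketch; record fields and quality rules are illustrative.
import csv
import io

def extract(raw_csv: str):
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records):
    """Transform: normalize types, drop rows failing basic quality checks."""
    clean = []
    for r in records:
        try:
            r["amount"] = float(r["amount"])
        except (KeyError, ValueError):
            continue  # quality check: reject malformed rows
        r["site"] = r["site"].strip().upper()
        clean.append(r)
    return clean

def load(records, target: list):
    """Load: append validated records to a target store (a list here)."""
    target.extend(records)
    return len(records)

raw = "site,amount\nhyd ,10.5\npune,oops\nblr,3\n"
store = []
loaded = load(transform(extract(raw)), store)
print(loaded)            # 2 rows survive validation
print(store[0]["site"])  # HYD
```

In a real Glue or Spark job the same three stages would run against distributed sources and sinks rather than in-memory lists.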

Posted 1 week ago

Apply

10.0 years

0 Lacs

Mysore, Karnataka, India

On-site


Company Description

Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description

When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to make that belief a reality by providing brands and retailers the visibility they need. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store channels. We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.
What You Will Do

- Think like our customers: you will work with product and engineering leaders to define data solutions that support customers' business practices
- Design, develop, and extend our data pipeline services and architecture to implement your solutions: you will collaborate on some of the most important and complex parts of our system, which form the foundation for the business value our organization provides
- Foster team growth: provide mentorship to junior team members and evangelize expertise to those on other teams
- Improve the quality of our solutions: help build enduring trust within our organization and amongst our customers by ensuring high quality standards for the data we manage
- Own your work: you will take responsibility to shepherd your projects from idea through delivery into production
- Bring new ideas to the table: some of our best innovations originate within the team

Technologies We Use

Languages: SQL, Python
Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
Others: Tableau (as a business intelligence solution)

Qualifications

- Bachelor's/Master's degree in Computer Science or a relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python and SQL
- Strong proficiency with data processing technologies such as Spark, Flink, and Airflow
- Strong proficiency with RDBMS/NoSQL/big data solutions (Postgres, MongoDB, Snowflake, etc.)
- Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
- Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of columnar/row-oriented data structures (e.g. Parquet, ORC, Avro, etc.)
- Solid understanding of Apache Iceberg or other open table formats
- Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
- Solid understanding of AWS, Linux, and infrastructure concepts
- Proven ability to diagnose and address data abnormalities in systems
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Experience building data warehouses using conformed dimensional models
- Experience building data lakes and/or leveraging data lake solutions (e.g. Trino, Dremio, Druid, etc.)
- Experience working with business intelligence solutions (e.g. Tableau)
- Experience working with ML/agentic AI pipelines (e.g. LangChain, LlamaIndex, etc.)
- Understands Domain-Driven Design concepts and the accompanying microservice architecture
- Passion for data, analytics, or machine learning
- Focus on value: shipping software that matters to the company and the customer

Bonus Points

- Experience working with vector databases
- Experience working within a retail or ecommerce environment
- Proficiency in other programming languages such as Scala, Java, Golang, etc.
- Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility

- Provide mentorship to team members on adopted patterns and best practices
- Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Additional Information

EEO STATEMENT

Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc.
are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
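The qualifications in this listing call out columnar versus row-oriented data structures (Parquet, ORC, Avro). A toy sketch of that distinction in plain Python, with invented product records rather than Wiser data: a columnar layout lets an aggregate scan exactly one field instead of every whole record.

```python
# Illustrative only: row-oriented vs columnar layout of the same data.

rows = [  # row-oriented: one record per entry (OLTP-friendly)
    {"sku": "A1", "price": 10.0, "in_stock": True},
    {"sku": "B2", "price": 4.5,  "in_stock": False},
    {"sku": "C3", "price": 7.5,  "in_stock": True},
]

# columnar: one array per field (the OLAP/Parquet-style layout)
columns = {k: [r[k] for r in rows] for k in rows[0]}

# An aggregate now touches only the "price" column, not whole records.
avg_price = sum(columns["price"]) / len(columns["price"])
print(avg_price)  # ≈ 7.33
```

Real columnar formats add compression and encoding per column on top of this layout, which is why they dominate analytical workloads.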

Posted 1 week ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Mohali

Remote


Dear Students,

We are excited to share an excellent opportunity to kick-start your professional journey! Hi-Path Technologies, a sister concern of RevClerx Pvt. Ltd., is committed to nurturing young talent. Please review the job description below and share your CV at suman.sharma@revclerx.com.

Job description: Maintain CCTV, X-ray machines, RFID gates, and the Scala display board with network switches (we resolve any problem in the display board and other equipment), and work at other sites on cabling and installation of products on site. Candidates must be willing to do field work.

Posted 1 week ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office


ABOUT AMGEN

Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE

Role Description: Let's do this. Let's change the world. We are looking for a highly motivated, expert Data Engineer who can own the design, development, and maintenance of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:

- Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets
- Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems
- Design and implement solutions to enable unified data access, governance, and interoperability across hybrid cloud environments
- Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB, etc.), APIs, logs, event streams, images, PDFs, and third-party platforms
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
- Apply expertise in data quality, data validation, and verification frameworks
- Innovate, explore, and implement new tools and technologies to enhance efficient data processing
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
- Collaborate and communicate effectively with product teams and other cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:

- Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of AWS services
- Ability to quickly learn, adapt, and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices

Good-to-Have Skills:

- Data engineering experience in the biotechnology or pharma industry
- Experience in writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications

- Minimum 5 to 8 years of Computer Science, IT, or related field experience
- AWS Certified Data Engineer preferred
- Databricks Certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:

- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized, and be detail oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
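This listing repeatedly stresses rigorous data-quality checks and validation/verification frameworks. A minimal rule-based sketch in plain Python of what such a check might look like (the rule names and record fields are invented, not Amgen's framework):

```python
# Hypothetical data-quality validation sketch; rules and fields are invented.

def check_not_null(rec, field):
    """A value must be present and non-empty."""
    return rec.get(field) not in (None, "")

def check_range(rec, field, lo, hi):
    """A numeric value must fall within an allowed range."""
    return field in rec and lo <= rec[field] <= hi

RULES = [
    ("patient_id present", lambda r: check_not_null(r, "patient_id")),
    ("dose in range",      lambda r: check_range(r, "dose_mg", 0, 500)),
]

def validate(records):
    """Return (passed, failures); each failure pairs a record with its failed rules."""
    passed, failures = [], []
    for rec in records:
        failed = [name for name, rule in RULES if not rule(rec)]
        if failed:
            failures.append((rec, failed))
        else:
            passed.append(rec)
    return passed, failures

data = [
    {"patient_id": "P1", "dose_mg": 20},
    {"patient_id": "",   "dose_mg": 20},
    {"patient_id": "P3", "dose_mg": 900},
]
ok, bad = validate(data)
print(len(ok), len(bad))  # 1 2
```

Production frameworks add scheduling, alerting, and quarantine of failing rows on top of this same pass/fail core.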

Posted 1 week ago

Apply

11.0 - 16.0 years

27 - 32 Lacs

Hyderabad

Work from Office


Director - Portfolio Operations Delivery

What you will do

Let's do this. Let's change the world. In this vital role, the Director, Portfolio Effectiveness and Optimization Results Delivery within the Customer Data & Analytics team is accountable for coordinating our delivery efforts across the internal and external teams located in AIN and across India. In addition, the Director must manage relationships across a complex internal set of teams and functional groups. This position reports to the Associate Vice President, Portfolio Effectiveness and Optimization, and will be responsible for the following.

Responsibilities

- Key Integrator: Act as the main point of contact and representative of the Portfolio Effectiveness and Optimization team in India
- Talent Development: Hire, train, develop, and manage talent to meet organizational needs
- Global Collaboration: Act as the primary point of contact for PE&O senior leadership in the US and the offshore team in India, either through our contract teams or direct AIN FTEs
- Operational Excellence and Delivery: Oversee end-to-end delivery of core data and analytics projects, ensuring quality, scalability, and operational efficiency, while promoting standard processes in data governance and analytics methodologies
- Offshore Vendor Management: Manage offshore teams including CWs, maintaining quality of service and timely deliverables
- Innovation Leadership: Foster a culture of innovation, ensuring the India team remains at the forefront of emerging technologies and trends in analytics and AI
- Business Impact & Collaborator Management: Ensure analytics solutions drive tangible business outcomes, and collaborate with global key collaborators to refine requirements, measure impact, and report progress
- Financial Management: Oversee the PE&O budget associated with offshore work in India, ensuring best negotiated rates and overall value

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:

- Doctorate degree and 4 years of statistics, operations research, mathematics, econometrics, business administration or a quantitative field experience OR
- Master's degree and 14 to 16 years of statistics, operations research, mathematics, econometrics, business administration or a quantitative field experience OR
- Bachelor's degree and 16 to 18 years of statistics, operations research, mathematics, econometrics, business administration or a quantitative field experience
- Managerial experience, directly handling people, and/or leadership experience leading teams, projects, programs or directing the allocation of resources

Preferred Qualifications:

- Relevant data science certifications and bio/pharmaceutical industry experience
- 8+ years of innovative Data Science/Advanced Analytics leadership experience
- Experience in AI, machine learning, quantitative methods, multivariate statistics, predictive modelling and other analytics frameworks/techniques, with 10+ years of experience delivering complex analytical projects
- Minimum 5 years of professional experience with Amazon Web Services (Redshift, S3, Athena, etc.) and industry-standard data warehousing technologies (Snowflake, Spark, Airflow, etc.)
- Advanced proficiency and hands-on coding experience in Python/R/Scala/Java or any other object-oriented programming language; ETL using SQL/shell scripting
- Experience successfully delivering AI/ML-based Next Best Action recommendation engines that optimize against desired objective function(s)
- Expertise in setting up and measuring randomized controlled trials, cohort studies, and matched case-control studies
- Comprehensive understanding of the components of setting up data models and running scenario planning that match the business need
- Experience in setting up processes for data ingestion, quality checks, etc.
- Thorough understanding of tagging, Google Analytics, CRM, Content Management Systems, and other components of a Digital Marketing Ecosystem
- Leadership experience in building and developing dedicated teams, delivering results, and shaping the future
- Ability to foster and encourage an environment of openness and transparency in seeking diverse opinions, and to empower risk-taking in idea generation, idea incubation and/or experimentation
- Ability to lead the creation of an analytics-driven culture that drives top-line growth, controls costs, and takes timely corrective action to reduce risks that derail plans
- Ability to think strategically about issues impacting an entire portfolio of therapeutics across geographies and stages of development
- Experience managing multiple senior key collaborators, prioritizing across a multitude of responsibilities and allocating resources to drive maximum impact
- Partners with business leaders to deliver high-quality predictions that guide strategic decision making
- Oral, written and presentation skills to explain complex concepts and controversial findings clearly to a variety of audiences, including senior management
- Comfortable challenging the status quo and bringing forward innovative solutions
- Ability to identify areas for process and systems innovation and implement change that will enhance the overall effectiveness of the team
- Comfortable working through and leading large-scale global change management
- Understanding of technology platforms and ability to partner with IS/IT and business leaders

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.
careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

3.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office


What you will do

Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:

- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member who assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to standard methodologies for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:

- Doctorate degree OR
- Master's degree and 4 to 6 years of Computer Science, IT or related field experience OR
- Bachelor's degree and 6 to 8 years of Computer Science, IT or related field experience OR
- Diploma and 10 to 12 years of Computer Science, IT or related field experience

Preferred Qualifications:

Functional Skills:

Must-Have Skills

- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
Experienced with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA). Good-to-Have Skills: Experience with cloud platforms such as AWS, particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena). Strong understanding of data modeling, data warehousing, and data integration concepts. Understanding of machine learning pipelines and frameworks for ML/AI models. Professional Certifications: AWS Certified Data Engineer (preferred). Databricks Certified (preferred). Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. Equal opportunity statement: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us: careers.amgen.com. As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

On-site

Linkedin logo

DATA SCIENCE + GEN AI Major Duties & Responsibilities: Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions. Create Proof of Concepts (POCs) / Minimum Viable Products (MVPs), then guide them through to production deployment and operationalization of projects. Influence machine learning strategy for Digital programs and projects. Make solution recommendations that appropriately balance speed to market and analytical soundness. Explore design options to assess efficiency and impact, and develop approaches to improve robustness and rigor. Develop analytical/modeling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow). Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations. Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories. Create algorithms to extract information from large, multiparametric data sets. Deploy algorithms to production to identify actionable insights from large databases. Compare results from various methodologies and recommend optimal techniques. Develop and embed automated processes for predictive model validation, deployment, and implementation. Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science. Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment. Lead discussions at peer reviews and use interpersonal skills to positively influence decision making.
Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices. Facilitate cross-geography sharing of new ideas, learnings, and best practices. Required Qualifications: Educational Requirement: Bachelor of Science or Bachelor of Engineering (at a minimum). Experience: 4+ years of work experience as a Data Scientist. Skills: A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to quickly cycle hypotheses through the discovery phase of a project. Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala). Good hands-on skills in both feature engineering and hyperparameter optimization. Experience producing high-quality code, tests, and documentation. Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, and Databricks. Understanding of descriptive and exploratory statistics, predictive modeling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and deep learning methodologies. Proficiency in statistical concepts and machine learning algorithms. Good knowledge of Agile principles and processes. Ability to lead, manage, build, and deliver customer business results through data scientists or professional services teams. Ability to share ideas compellingly, and to summarize and communicate data analysis assumptions and results. Self-motivated and a proactive problem solver who can work independently and in teams.
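The predictive-modeling work this posting describes boils down to fitting model parameters against data and validating the fit. As a toy illustration only (real work on this team would use libraries such as scikit-learn or TensorFlow, not hand-rolled loops), the core fitting loop can be sketched with gradient descent on a linear model:

```python
# Toy gradient-descent fit of y = w*x + b, illustrating the model-fitting
# loop behind predictive modeling; production code would use scikit-learn,
# TensorFlow, or similar rather than this hand-rolled sketch.

def fit_linear(xs, ys, lr=0.01, epochs=2000):
    """Minimize mean squared error over (w, b) by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # gradients of MSE with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # synthetic data generated by y = 2x + 1
w, b = fit_linear(xs, ys)
```

The same iterate-evaluate-adjust structure underlies the hyperparameter optimization and model validation steps the role lists, just at a higher level of abstraction.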

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Bachelor's or master's degree in Computer Science, Information Technology, Data Science, or a related field. Must have a minimum of 4 years of relevant experience. Proficient in Python with hands-on experience building ETL pipelines for data extraction, transformation, and validation. Strong SQL skills for working with structured data. Familiar with Grafana or Kibana for data visualization and monitoring/dashboards. Experience with databases such as MongoDB, Elasticsearch, and MySQL. Comfortable working in Linux environments using common Unix tools. Hands-on experience with Git, Docker, and virtual machines.

Posted 1 week ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As an Associate Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Preferred Education: Master's Degree. Required Technical And Professional Expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred Technical And Professional Experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.

Posted 1 week ago

Apply

5.0 - 9.0 years

20 - 25 Lacs

Pune

Work from Office

Naukri logo

Primary Responsibilities Provide engineering leadership, mentorship, and technical direction to a small team of other engineers (~6 members). Partner with your Engineering Manager to ensure engineering tasks are understood, broken down, and implemented to the highest quality standards. Collaborate with members of the team to solve challenging engineering tasks on time and with high quality. Engage in code reviews and training of team members. Support continuous deployment pipeline code. Situationally troubleshoot production issues alongside the support team. Continually research and recommend product improvements. Create and integrate features for our enterprise software solution using the latest Python technologies. Assist and adhere to enforcement of project deadlines and schedules. Evaluate, recommend, and propose solutions to existing systems. Actively communicate with team members to clarify requirements and overcome obstacles to meet the team goals. Leverage open-source and other technologies and languages outside of the Python platform. Develop cutting-edge solutions to maximize the performance, scalability, and distributed processing capabilities of the system. Provide troubleshooting and root cause analysis for production issues that are escalated to the engineering team. Work with development teams in an agile context as it relates to software development, including Kanban, automated unit testing, test fixtures, and pair programming. Requirements: 4-8 or more years of experience as a Python developer on enterprise projects using Python, Flask, FastAPI, Django, PyTest, Celery, and other Python frameworks. Software development experience including object-oriented programming, concurrency programming, modern design patterns, RESTful service implementation, micro-service architecture, test-driven development, and acceptance testing.
Familiarity with tools used to automate the deployment of an enterprise software solution to the cloud: Terraform, GitHub Actions, Concourse, Ansible, etc. Proficiency with Git as a version control system. Experience with Docker and Kubernetes. Experience with relational SQL and NoSQL databases, including MongoDB and MSSQL. Experience with object-oriented languages: Python, Java, Scala, C#, etc. Experience with testing tools such as PyTest, WireMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as BigQuery, GKE, GCS, Dataflow, Kubeflow, and/or Vertex AI. Excellent problem-solving and communication skills. Experience with Java and Spring a big plus. Disability Accommodation: UKGCareers@ukg.com.
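Among the skills this posting lists, concurrency programming is one that can be shown compactly with the standard library. A minimal sketch (the endpoint names are made up for illustration, and a real service call would replace the stand-in function):

```python
# Sketch of concurrency programming with the standard library: fan out
# several I/O-bound calls across a thread pool. Endpoint names below are
# hypothetical illustrations, not a real API.
from concurrent.futures import ThreadPoolExecutor

def fetch(endpoint):
    """Stand-in for an I/O-bound call (e.g., a blocking REST request)."""
    return f"response from {endpoint}"

endpoints = ["/users", "/orders", "/inventory"]
with ThreadPoolExecutor(max_workers=3) as pool:
    # map() submits all calls concurrently but yields results in input order
    results = list(pool.map(fetch, endpoints))
```

Because `ThreadPoolExecutor.map` preserves input order, the results line up with the request list even though the calls overlap in time, which keeps the calling code simple.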

Posted 1 week ago

Apply

8.0 - 13.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Location: Bengaluru. We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, with a strong focus on Databricks, Python, and SQL. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to support various business needs. Key Responsibilities: Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Databricks. Work closely with the investment management team to understand data structures and business requirements, ensuring data accuracy and quality. Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems. Optimize database performance by designing scalable and cost-effective solutions. What's on offer: Competitive salary and benefits package. Opportunities for professional growth and development. A collaborative and inclusive work environment. The chance to work on impactful projects with a talented team. Candidate Profile: Experience: 8+ years of experience in data engineering or a similar role. Proficiency in Apache Spark and Databricks Data Cloud, including schema design, data partitioning, and query optimization. Exposure to Azure. Exposure to streaming technologies (e.g., Auto Loader, DLT streaming). Advanced SQL and data modeling skills, and data warehousing concepts tailored to investment management data (e.g., transaction, accounting, portfolio, and reference data). Experience with ETL/ELT tools like SnapLogic and programming languages (e.g., Python, Scala, R). Familiarity with workload automation and job scheduling tools such as Control-M. Familiar with data governance frameworks and security protocols. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Education: Bachelor's degree in computer science, IT, or a related discipline.
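The data-modeling skills this posting asks for center on transaction and portfolio data. As a hypothetical illustration (the table, column names, and tickers below are invented, and SQLite stands in for the Databricks SQL engine the role actually uses), rolling transactions up into net positions looks like this:

```python
# Hypothetical illustration of investment-data SQL modeling: aggregate
# raw transactions into per-portfolio net positions. SQLite stands in
# for the Databricks/warehouse SQL engine; schema and tickers are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transactions (
        portfolio_id TEXT, ticker TEXT, quantity REAL
    )
""")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("P1", "AAPL", 10), ("P1", "AAPL", -4), ("P1", "MSFT", 7), ("P2", "AAPL", 3)],
)
# Net position = sum of buys and sells per (portfolio, ticker);
# fully closed positions (net zero) are filtered out by HAVING.
positions = conn.execute("""
    SELECT portfolio_id, ticker, SUM(quantity) AS net_qty
    FROM transactions
    GROUP BY portfolio_id, ticker
    HAVING SUM(quantity) != 0
    ORDER BY portfolio_id, ticker
""").fetchall()
```

The same GROUP BY / HAVING shape scales to the partitioned Delta tables mentioned above; only the engine and the partitioning strategy change.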

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Job Title: SailPoint. Job Location: Bangalore / Chennai / Hyderabad / Pune / Delhi NCR. Work Mode: Work From Office (5 days). One of our esteemed clients, a CMM Level 5 organization, is planning to fill over 1000 positions for SAP All Modules (MM with WM / Basis / SD / Basis HANA). This is an incredible chance for you to take your career to new heights. SAP PP/QM / SAP PS / SAP CPI / SAP MDG / SAP BODS / OpenText / S/4 SAP Fiori / S/4 SAP FICO / S/4 SAP ABAP / SAP Basis Admin / SAP Basis with S/4HANA / SAP ABAP BW / SAP BOBJ Admin / SAP Basis / BOBJ Admin / SAP PP / BODS / WM / Data Migration / SAP Security / ITGC Control and Audit / Ab Initio / Desktop Support / Service Desk / MuleSoft / SOC / Cyber Security / Scala / SailPoint. Allow us to provide you with more details: 1. Payroll and Location: If selected, you will be working on the payroll of our organization, Diverse Lynx India Pvt. Ltd., and stationed at our client's office located in Hyderabad. (WFO is mandatory.) 2. Interview Process: Once your profile has been shortlisted by our technical panel, we will promptly arrange a face-to-face or virtual interview for you. Rest assured that we will keep you informed every step of the way. Kindly help us with the interview slot date and timing so that we can line up the project team for technical evaluation. 3. Selection Confirmation: We understand how important it is to receive timely updates during the hiring process. Therefore, we are committed to providing confirmation of your selection on the same day as your interview. This is an excellent opportunity for professionals like yourself who are seeking growth and development in the SAP field. Must have implementation/support experience with a minimum of 4 years of total experience. To book your interview slots, please help with the below details, and you can connect with our Head directly for any further information and support. Experience: 4-10 years.
-Full Name (As per PAN Card) -PAN Card No: -Date of Birth- DD/MM/YEAR/- -Total Experience - -Relevant Experience in the given Skill set - -Highest Qualification and Passing Year with Date 10th Onwards: - -Current Location - -Preferred Location - -Current Company (Parent Company) -Contact No. - -Mail Id - -Notice Period - -Reason for Change - -Current CTC -Expected CTC Best Regards Manish Prasad Account Manager(Human Resource-Recruitment) Diverse Lynx India Pvt. Ltd. Email ID:- manish.prasad@diverselynx.in URL: http://www.diverselynx.in

Posted 1 week ago

Apply

9.0 - 12.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Naukri logo

About the Role: Grade Level (for internal use): 11. The Team: Our team is responsible for the design, architecture, and development of our client-facing applications using a variety of tools that are regularly updated as new technologies emerge. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. The Impact: The work you do will be used every single day; it's the essential code you'll write that provides the data and analytics required for crucial, daily decisions in the capital and commodities markets. What's in it for you: Build a career with a global company. Work on code that fuels the global financial markets. Grow and improve your skills by working on enterprise-level products and new technologies. Responsibilities: Solve problems, analyze and isolate issues. Provide technical guidance and mentoring to the team and help them adopt change as new processes are introduced. Champion best practices and serve as a subject matter authority. Develop solutions to support key business needs. Engineer components and common services based on standard development models, languages and tools. Produce system design documents and lead technical walkthroughs. Produce high-quality code. Collaborate effectively with technical and non-technical partners. As a team member, continuously improve the architecture. Basic Qualifications: 9-12 years of experience designing/building data-intensive solutions using distributed computing. Proven experience in implementing and maintaining enterprise search solutions in large-scale environments. Experience working with business stakeholders and users, providing research direction and solution design, and writing robust, maintainable architectures and APIs. Experience developing and deploying search solutions in a public cloud such as AWS.
Proficient programming skills in high-level languages such as Java, Scala, and Python. Solid knowledge of at least one machine learning research framework. Familiarity with containerization, scripting, cloud platforms, and CI/CD. 5+ years of experience with Python, Java, Kubernetes, and data and workflow orchestration tools. 4+ years of experience with Elasticsearch, SQL, NoSQL, Apache Spark, Flink, Databricks, and MLflow. Prior experience with operationalizing data-driven pipelines for large-scale batch and stream processing analytics solutions. Good to have: experience contributing to GitHub and open-source initiatives or research projects, and/or participation in Kaggle competitions. Ability to quickly, efficiently, and effectively define and prototype solutions with continual iteration within aggressive product deadlines. Demonstrate strong communication and documentation skills for both technical and non-technical audiences. Preferred Qualifications: Search Technologies: Query and indexing content for Apache Solr, Elasticsearch, etc. Proficiency in search query languages (e.g., Lucene Query Syntax) and experience with data indexing and retrieval. Experience with machine learning models and NLP techniques for search relevance and ranking. Familiarity with vector search techniques and embedding models (e.g., BERT, Word2Vec). Experience with relevance tuning using A/B testing frameworks. Big Data Technologies: Apache Spark, Spark SQL, Hadoop, Hive, Airflow. Data Science Search Technologies: Personalization and recommendation models, Learning to Rank (LTR). Preferred Languages: Python, Java. Database Technologies: MS SQL Server platform, stored procedure programming experience using Transact-SQL. Ability to lead, train, and mentor.
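The vector search technique this posting mentions reduces to ranking documents by the similarity of their embedding vectors to a query vector. A minimal sketch (the 3-dimensional vectors and document names below are toy stand-ins; real embeddings from models like BERT or Word2Vec have hundreds of dimensions and would be served from a vector index, not a dict):

```python
# Minimal sketch of vector search: rank documents by cosine similarity
# between embedding vectors. The 3-d vectors and document names are toy
# stand-ins for real embeddings (e.g., from BERT or Word2Vec).
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {
    "doc_ratings":  [0.9, 0.1, 0.0],
    "doc_research": [0.1, 0.9, 0.1],
    "doc_markets":  [0.5, 0.5, 0.2],
}
query = [1.0, 0.0, 0.0]  # embedding of the user's query
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
```

Engines like Elasticsearch apply the same idea at scale with approximate nearest-neighbor indexes instead of the brute-force scan shown here.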

Posted 1 week ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

About the Role: Grade Level (for internal use): 08. One of the most valuable assets in today's financial industry is data, which can provide businesses the intelligence essential to making business and financial decisions with conviction. This role will provide you an opportunity to work on Ratings and Research related data. You will get an opportunity to work on cutting-edge big data technologies and will be responsible for development of both data feeds and API work. Location: Hyderabad. The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work deals with content ingestion, data feed generation, and exposing the data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible. Impact: As a member of the Xpressfeed Team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our Software Engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application designs, write high-quality code, and innovate on how to improve the overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase for Data Management Solutions at S&P Global, can contribute great ideas, solutions and code, and understands the value of cloud solutions, we would like to talk to you. What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development.
In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server based database development. This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development. Responsibilities: Analyze, design and develop solutions within a multi-functional Agile team to support key business needs for the data feeds. Design, implement and test solutions using AWS EMR for content ingestion. Work on complex SQL Server projects involving high-volume data. Engineer components and common services based on standard corporate development models, languages and tools. Apply software engineering best practices while also leveraging automation across all elements of solution delivery. Collaborate effectively with technical and non-technical stakeholders. Must be able to document and demonstrate technical solutions by developing documentation, diagrams, code comments, etc. Basic Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. 3-6 years of experience in application development. Minimum of 2 years of hands-on experience with Scala. Minimum of 2 years of hands-on experience with Microsoft SQL Server. Solid understanding of Amazon Web Services (AWS) and cloud-based development. In-depth knowledge of system architecture, object-oriented programming, and design patterns. Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing. Preferred Qualifications: Familiarity with AWS services: EMR, Auto Scaling, EKS. Working knowledge of Snowflake. Preferred experience in Python development. Familiarity with the Financial Services domain and Capital Markets is a plus. Experience developing systems that handle large volumes of data and require high computational performance.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

Why We Work at Dun & Bradstreet Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers. We are seeking a highly skilled Senior DevOps Engineer with a strong background in managing cloud-based applications and infrastructure. The ideal candidate will have deep expertise in AWS and GCP, infrastructure automation using Terraform, and building scalable CI/CD pipelines using tools like Jenkins and Harness. You will play a key role in optimizing development workflows, ensuring system reliability, and enabling efficient cloud operations. Key Responsibilities: Design, implement, and maintain scalable, secure, and highly available cloud infrastructure on AWS and GCP. Develop and manage Infrastructure as Code (IaC) using Terraform for consistent and repeatable infrastructure provisioning. Build, monitor, and optimize CI/CD pipelines using Jenkins, Harness, and other DevOps tools to support continuous delivery. Collaborate with development and QA teams to automate and streamline build, test, and release processes. Monitor application and infrastructure performance, identify issues, and implement automation for alerting and recovery. Manage and enhance containerization and orchestration platforms (e.g., Docker, Kubernetes). Implement best practices for security, compliance, and cost optimization in cloud environments. Key Skills: Strong expertise in cloud platforms: AWS (EC2, S3, IAM, RDS, Lambda, etc.) and GCP (Compute Engine, Cloud Functions, GKE, etc.). Proficiency in Terraform and CloudFormation for infrastructure automation.
Solid experience building and maintaining CI/CD pipelines using Jenkins and Harness. Good understanding of Linux system administration and networking fundamentals. Experience with monitoring tools (e.g., CloudWatch, Splunk). Working knowledge of containers and orchestration platforms like Docker and Kubernetes. Familiarity with scripting languages like Python or Scala. Strong problem-solving and troubleshooting skills. Experience in managing cost optimization and security in cloud environments
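The Infrastructure as Code practice this posting centers on declares cloud resources in configuration rather than provisioning them by hand. As a minimal Terraform sketch (the bucket name, region, and tags below are placeholders, not a real environment):

```hcl
# Hypothetical Terraform sketch of the IaC practice described above;
# region, bucket name, and tags are illustrative placeholders.
provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake-bucket" # placeholder; bucket names are globally unique

  tags = {
    Environment = "dev"
    ManagedBy   = "terraform"
  }
}
```

Running `terraform plan` against such a file shows the provisioning diff before `terraform apply` makes any change, which is what makes the provisioning consistent and repeatable.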

Posted 1 week ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Hyderabad, Pune, Chennai

Work from Office

Naukri logo

Senior and Lead Engineer Full Stack, Sahaj Software. Location: Bengaluru | Chennai | Hyderabad | London | Pune. About the role: You'll thrive if you're hands-on, grounded, and passionate about building with technology. Our diverse tech stack includes TypeScript, Java, Scala, Kotlin, Golang, Elixir, Python, .NET, Node.js, and Rust. This role offers significant impact and growth opportunities while staying hands-on. We focus on lean teams without traditional management layers, working in small, collaborative teams (2-5 people) where a well-founded argument holds more weight than years of experience. You'll develop tailored software solutions to meet clients' unique needs across multiple domains. Responsibilities: Remain fully hands-on and write high-quality, production-ready code that enables smooth deployment of solutions. Lead architecture and design decisions, ensuring adherence to best practices in technology choices and system design. Utilize DevOps tools and practices to automate and streamline the build and deployment processes. Work closely with Data Scientists and Engineers to deliver robust, production-level AI and Machine Learning systems. Develop frameworks and tools for

Posted 1 week ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

ECMS Number - Skill: ADB+ADF and Power BI. Role: Tech Lead. Experience level: Overall 8+ years, relevant 5+ years in ADF and Databricks followed by Power BI. Project Description (min 50 words): High proficiency and 8-10 years of experience in designing/developing data analytics and data warehouse solutions with Python, Azure Data Factory (ADF), and Azure Databricks. He/she tends to test code manually and does not utilize automated testing frameworks for Python and PySpark. While he has a foundational understanding of Spark architecture and the Spark execution model, he has limited experience in optimizing code based on Spark monitoring features. Experience in designing large data distribution, integration with service-oriented architecture and/or data warehouse solutions, and Data Lake solutions using Azure Databricks with large and multi-format data. Ability to translate a working solution into an implementable working package using the Azure platform. Good understanding of Azure Storage Gen2. Hands-on experience with the Azure stack (minimum 5 years), Azure Databricks, and Azure Data Factory. Proficient coding experience using Spark (Scala/Python) and T-SQL. Understanding of the services related to Azure Analytics, Azure SQL, Azure Function App, and Logic App. Should be able to demonstrate a constant and quick learning ability and handle pressure situations without compromising on quality. Power BI: Report development using Power BI and analysis of SSRS reports. Power BI data modelling experience is an advantage. Work involves report development as well as migration of SSRS reports. Must Have: [Power BI (cloud SaaS) / Paginated Report Builder / Power Query / Data modeling]. Strong SQL scripting is required. Well organized and able to manage multiple projects in a fast-paced, demanding environment. Attention to detail and quality; excellent problem solving and communication skills. Ability and willingness to learn new tools and applications.
Work Location with zip code Pune BGC Completion timeline: Before/post onboarding Before Onboarding Vendor billing Max. 13000 INR/Day

Posted 1 week ago

Apply

8.0 - 13.0 years

35 - 40 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Description: ACCOUNTABILITIES: Designs, codes, tests, debugs and documents software according to Dell's systems quality standards, policies and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address defects. Retests software corrections to ensure problems are resolved. Documents evolution of testing procedures for future replication. May conduct performance and scalability testing. RESPONSIBILITIES: Plans, conducts and leads assignments generally involving moderate to high budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates and researches test results; documents evolution of testing scripts for future replication. Identifies, recommends and implements changes to enhance the effectiveness of quality assurance strategies.
Description Comments: Skills: Python, PySpark and SQL. 8+ years of experience in Spark, Scala and PySpark for big data processing. Proficiency in Python programming for data manipulation and analysis. Experience with Python libraries such as Pandas and NumPy. Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL). Strong knowledge of SQL for querying databases. Experience with database systems like Lakehouse, PostgreSQL, Teradata and SQL Server. Ability to write complex SQL queries for data extraction and transformation. Strong analytical skills to interpret data and provide insights. Ability to troubleshoot and resolve data-related issues. Strong problem-solving skills to address data-related challenges. Effective communication skills to collaborate with cross-functional teams. Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Collaborate with other teams to understand data requirements and deliver solutions. Design, develop, and maintain scalable data pipelines using Python and PySpark. Utilize PySpark and Spark scripting for data processing and analysis. Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored. Develop and maintain Power BI reports and dashboards. Optimize data pipelines for performance and reliability. Integrate data from various sources into centralized data repositories. Ensure data quality and consistency across different data sets. Analyze large data sets to identify trends, patterns, and insights. Optimize PySpark applications for better performance and scalability. Continuously improve data processing workflows and infrastructure. Not to Exceed Rate: (No Value)
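The ETL pattern this listing describes (extract raw records, transform and validate, load into a warehouse table, query with SQL) can be sketched in miniature. This is an illustrative pure-Python sketch only: a production version would use PySpark, and here `sqlite3` stands in for the warehouse; the table, columns, and sample rows are all hypothetical.

```python
import sqlite3

# Hypothetical raw source records for an ETL step.
raw_orders = [
    {"order_id": 1, "qty": 2, "unit_price": 9.5},
    {"order_id": 2, "qty": 0, "unit_price": 4.0},   # dropped: zero quantity
    {"order_id": 3, "qty": 1, "unit_price": 20.0},
]

def transform(rows):
    """Transform: keep valid rows and derive an order total."""
    return [
        (r["order_id"], r["qty"], r["qty"] * r["unit_price"])
        for r in rows if r["qty"] > 0
    ]

# Load into a (stand-in) warehouse table, then query it with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INT, qty INT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transform(raw_orders))

total = conn.execute("SELECT SUM(total) FROM orders").fetchone()[0]
print(total)  # 39.0
```

The same extract/transform/load shape scales up directly: in PySpark the transform would be a DataFrame filter plus a derived column, and the load a write to the target table.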

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office


About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects and product managers striving to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience.
To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing, and are eager to contribute to data engineering. Position Overview: Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams. Analyze technical issues and questions, identifying data needs and delivery mechanisms. Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies. Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components. Provide technical guidance and heavily contribute to a team of high-caliber Data Engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion. Manage and execute against agile plans and set deadlines based on client, business, and technical requirements. Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations. Ensure all code adheres to development & security standards. About you: 4-year degree or equivalent experience. 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark etc.). Hands-on experience in object-oriented or functional programming such as Scala / Java / Python. Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server). Knowledge of designing data integration using API and streaming technologies (Kafka) as well as ETL and other data integration patterns. Experience with cloud platforms like Google Cloud, AWS, or Azure.
Hands-on experience with BigQuery will be an added advantage. Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR or Databricks). Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus. Familiarity with data warehousing concepts and technologies. Maintains technical knowledge within areas of expertise. Constant learner and team player who enjoys solving tech challenges with a global team. Hands-on experience in building complex data pipelines and flow optimizations. Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront. Experience with test-driven development and software test automation. Follows best coding practices & engineering guidelines as prescribed. Strong written and verbal communication skills with the ability to present complex technical information in a clear and concise manner to a variety of audiences. Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
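The streaming-integration pattern the listing names (Kafka) typically boils down to aggregating keyed events over time windows. A minimal, pure-Python stand-in for that idea, with no Kafka dependency and entirely hypothetical sample events, might look like:

```python
from collections import defaultdict

# Hypothetical event stream: (epoch_seconds, event_type) pairs, as they might
# arrive from a Kafka topic.
events = [
    (0, "view"), (10, "click"), (59, "view"),
    (60, "view"), (75, "click"), (130, "view"),
]

def tumbling_counts(stream, window_s=60):
    """Count events per (window_start, event_type) over tumbling windows."""
    counts = defaultdict(int)
    for ts, etype in stream:
        window_start = (ts // window_s) * window_s  # bucket timestamp into window
        counts[(window_start, etype)] += 1
    return dict(counts)

print(tumbling_counts(events))
```

In a real pipeline the same windowed aggregation would be expressed in Spark Structured Streaming or Kafka Streams; the bucketing logic is the common core.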

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Lead Data Analyst About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. As part of the Merchandising Analytics and Insights team, our analysts work closely with business owners as well as technology and data product teams staffed with product owners and engineers. They support all Merchandising strategic initiatives with data, reporting and analysis. Merchandising teams rely on this team of analysts to bring data to support decision making. PRINCIPAL DUTIES AND RESPONSIBILITIES: As a Lead Data Analyst, your responsibilities will be exploring data, technologies, and the application of mathematical techniques to derive business insights. Data analysts spend their time determining the best approach to gather, model, manipulate, analyze and present data. You will lead an agile team, which requires active participation in ceremonies and team meetings.
In this role, you'll have opportunities to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. This role specifically will support new initiatives requiring new data and metric development. Your curiosity and ability to roll up guest-level insights will be critical. The team will leverage in-store analytics data to measure the impact that promotions, product placement, or physical changes at any given store have on store operations, guest experience and purchase decisions. This new capability is still being defined, which allows for creative thinking and leadership opportunities. Job duties may change at any time due to business needs. Key Responsibilities: Work with MC, Merch, Planning, Marketing, Digital, etc. teams to use data to identify opportunities, bridge gaps, and build advanced capabilities. Partner with Product, DS, DE, etc. to determine the best approach to gather, model, manipulate, analyze and present data. Develop data and metrics to support key business strategies, initiatives, and decisions. Explore data, technologies, and the application of mathematical techniques to derive business insights. Desired Skills & Experience: Ability to break down complex problems, identify the root cause of an issue, and develop sustainable solutions. Ability to influence cross-functional teams and partners at multiple levels of the organization. Analytical skills (SQL, Python, R, etc.) to find, manipulate, and present data in meaningful ways to clients. Desire to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. About you (technical): B.E/B.Tech, M.Tech, M.Sc., MCA - Overall 8-12 years of experience, with 6-8 years of data ecosystem experience. Strong architect of data capabilities and analytical tools.
Proven experience architecting enterprise-level data warehouse solutions and BI implementations across Domo, Tableau & other visualization tools. Provide expertise, and the ability to train and guide the team to implement top design architectures to build next-generation analytics. Deep big data experience: solid experience in the Hadoop ecosystem and its components, including writing MapReduce programs, developing Hive and PySpark SQL, and designing and developing Oozie workflows. Hands-on experience in object-oriented or functional programming such as Scala &/or Python/R or other open-source languages. Strong foundational mathematics and statistics. Experience in analytical techniques like Linear & Non-Linear Regression, Logistic Regression, Time-series models, Classification Techniques etc. (Soft skills for the lead role): Strong stakeholder management with product teams and business leaders. Strong problem solving, analytical skills and the ability to manage ambiguity. Ability to communicate results of complex analytic findings to both technical/non-technical audiences and business leaders. Ability to lead change, and work through conflict and setbacks. Experience working in an agile environment (stories, backlog refinement, sprints, etc.). Excellent attention to detail and timelines. Strong sense of ownership. Desire to continuously upskill to stay current with new technologies in the industry via formal training, peer training groups and self-directed education. Useful Links- Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
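Of the analytical techniques listed above, linear regression is the simplest to show concretely. This is an illustrative sketch with hypothetical data (x might be weeks on promotion, y the observed sales lift), using only the standard library:

```python
from statistics import mean

# Hypothetical data points for illustration.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

def ols(xs, ys):
    """Ordinary least squares fit; returns (slope, intercept)."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance term
    sxx = sum((x - mx) ** 2 for x in xs)                    # variance term
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = ols(xs, ys)
print(f"y = {slope:.2f}x + {intercept:.2f}")  # y = 1.99x + 0.09
```

In practice an analyst would reach for a library fit (e.g. in R or scikit-learn), but the closed-form slope/intercept above is exactly what those fits compute for one predictor.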

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office


About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Pyramid overview: A role with Target Data Science & Engineering means the chance to help develop and manage state-of-the-art predictive algorithms that use data at scale to automate and optimize decisions. Whether you join our Statistics, Optimization or Machine Learning teams, you'll be challenged to harness Target's impressive data breadth to build the algorithms that power solutions our partners in Marketing, Supply Chain Optimization, Network Security and Personalization rely on. Position Overview: As a Senior Engineer on the Search Team, you serve as a specialist in the engineering team that supports the product. You help develop and gain insight into the application architecture. You can distill an abstract architecture into a concrete design and influence the implementation. You show expertise in applying the appropriate software engineering patterns to build robust and scalable systems.
You are an expert in programming and apply your skills in developing the product. You have the skills to design and implement the architecture on your own, but choose to influence your fellow engineers by proposing software designs and providing feedback on software designs and/or implementations. You leverage data science in solving complex business problems. You make decisions based on data. You show good problem-solving skills and can help the team in triaging operational issues. You leverage your expertise in eliminating repeat occurrences. About You: 4-year degree in quantitative disciplines (Science, Technology, Engineering, Mathematics) or equivalent experience. Experience with search engines like SOLR and Elasticsearch. Strong hands-on programming skills in Java, Kotlin, Micronaut and Python. Experience with PySpark, SQL and Hadoop/Hive is an added advantage. Experience with streaming systems like Kafka; experience with Kafka Streams is an added advantage. Experience in MLOps is an added advantage. Experience in Data Engineering is an added advantage. Strong analytical thinking skills with an ability to creatively solve business problems, innovating new approaches where required. Able to produce reasonable documents/narratives suggesting actionable insights. Self-driven and results-oriented. Strong team player with the ability to collaborate effectively across geographies/time zones. Know More About Us here: Life at Target - https://india.target.com/ Benefits - https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
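The core data structure behind search engines like SOLR and Elasticsearch, mentioned above, is the inverted index: a map from each term to the set of documents containing it. A toy sketch with hypothetical documents and queries:

```python
# Hypothetical document collection keyed by doc id.
docs = {
    1: "red running shoes",
    2: "blue denim jacket",
    3: "red cotton jacket",
}

# Build the inverted index: term -> set of doc ids containing the term.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

def search(*terms):
    """Return doc ids containing all query terms (AND semantics)."""
    postings = [index.get(t, set()) for t in terms]
    return set.intersection(*postings) if postings else set()

print(sorted(search("red", "jacket")))  # [3]
```

Real engines add tokenization, relevance scoring (e.g. BM25), and distributed sharding on top, but query evaluation still starts from intersecting postings lists like this.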

Posted 1 week ago

Apply

4.0 - 9.0 years

14 - 18 Lacs

Bengaluru

Work from Office


About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Pyramid overview: A role with Target Data Science & Engineering means the chance to help develop and manage state-of-the-art predictive algorithms that use data at scale to automate and optimize decisions. Whether you join our Statistics, Optimization or Machine Learning teams, you'll be challenged to harness Target's impressive data breadth to build the algorithms that power solutions our partners in Marketing, Supply Chain Optimization, Network Security and Personalization rely on. Every Scientist on Target's Data Sciences team can expect modeling and data science, software/product development of highly performant code for model performance, and to elevate Target's culture and apply retail domain knowledge. About the role: As a Lead Data Scientist, you'll influence by interacting with the Data Sciences team, product teams, scientist/engineer individual contributors from other pillars, and business partners.
You will perform within the scale and scope of your role by defining solutions, beginning to identify problems to solve, and contributing to Data Sciences and Target's culture by modeling and contributing to the culture. You'll get the opportunity to use your expertise in one or more of the following areas: machine learning, probability theory & statistics, optimization theory, simulation, econometrics, deep learning, natural language processing or computer vision. We will look to you to own design and implementation of an algorithmic solution (e.g., a recommendation or forecasting algorithm), including data understanding, feature engineering, model development, validation and testing, and deployment to a production environment. You'll drive development of problem statements that capture the business considerations, define metrics/measurement to validate model performance, and drive feasibility studies with data requirements and potential solution approaches to be considered. You'll evaluate tradeoffs of simple vs. complex models/solutions in determining the right technique to employ for a business problem, and develop and maintain a nuanced understanding of the data generated by the business, including fundamental limitations of the data. You'll leverage your proficiency in one or more approved programming languages (Java, Scala, Python, R), and ensure foundational programming principles in developing code (best practices, writing unit tests, code organization, basics of CI/CD etc.) are followed in developing the team's products/models. You'll not only stitch together basic data pipelines for a given problem and own design and implementation of individual components within Data Science/Tech applications, but also articulate the technical strategy, the value of technology, and the impact on the business.
As you do so, you'll collaborate with engineers, scientists, and business partners/product owners to create algorithmic solutions that are performant and integrated into applications. We'll look to you to mentor and provide technical support within a team, including mentoring junior team members, and to present your work and your team's work to business partners and other Data Sciences teams. With a deeper understanding of your functional area of responsibility, you'll support agile ceremonies, collaborate with peers across multiple products, communicate and collaborate with business partners, and demonstrate an understanding of areas outside your scope of responsibility. The exciting part of retail? It's always changing! Core responsibilities of this job are described within this job description. Job duties may change at any time due to business needs. About you: 4-year degree in quantitative disciplines (Science, Technology, Engineering, Mathematics) and 6+ years of professional experience, or equivalent industry experience. Master's degree in quantitative disciplines (Science, Technology, Engineering, Mathematics). Good knowledge and experience developing optimization, simulation, and statistical models. Strong analytical thinking skills. Ability to creatively solve business problems, innovating new approaches where required. Strong hands-on programming skills in Python, SQL, Hadoop/Hive. Additional knowledge of Spark, Scala, R, Java desired but not mandatory. Good working knowledge of mathematical and statistical concepts, MILP, algorithms, and computational complexity. Passion for solving interesting and relevant real-world problems using a data science approach. Experience in implementing advanced statistical techniques like regression, clustering, PCA, forecasting (time series) etc. Able to produce reasonable documents/narratives suggesting actionable insights. Excellent communication skills.
Ability to clearly tell data-driven stories through appropriate visualizations, graphs, and narratives. Self-driven and results-oriented; able to meet tight timelines. Strong team player with the ability to collaborate effectively across geographies/time zones. Know More About Us here: Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
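Of the forecasting techniques the listing mentions, the simplest baseline is a moving-average forecast: predict the next point as the mean of the last k observations. This is an illustrative sketch with hypothetical sales figures, not any model used at Target:

```python
# Hypothetical weekly sales series.
sales = [100, 104, 98, 110, 108, 115]

def moving_average_forecast(series, k=3):
    """Forecast the next point as the mean of the last k observations."""
    window = series[-k:]
    return sum(window) / len(window)

print(moving_average_forecast(sales))  # (110 + 108 + 115) / 3 = 111.0
```

A baseline like this is what a more sophisticated time-series model (exponential smoothing, ARIMA, etc.) must beat to justify its extra complexity.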

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 14 Lacs

Bengaluru

Work from Office


About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from diverse backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values diverse backgrounds. We believe your unique perspective is important, and you'll build relationships by being authentic and respectful. At Target, inclusion is part of our core values. We aim to create equitable experiences for all, regardless of their dimensions of difference. As an equal opportunity employer, Target provides diverse opportunities for everyone to grow and win. About us: Working at Target means helping all families discover the joy of everyday life. We bring that vision to life through our values and culture. Learn more about Target here. As a Senior Engineer, you serve as the technical anchor for the engineering team that supports a product. You create, own and are responsible for the application architecture that best serves the product in its functional and non-functional needs. You identify and drive architectural changes to accelerate feature development or improve the quality of service (or both). You have deep and broad engineering skills and are capable of standing up an architecture in its whole on your own, but you choose to influence a wider team by acting as a force multiplier.
About this Team: IT Data Platform (ITDP) is the powerhouse data platform driving Target's tech efficiencies, seamlessly integrating operational and analytical needs. It fuels every facet of Target Tech, from boosting developer productivity and enhancing system intelligence to ensuring top-notch security and compliance. Target Tech builds the technology that makes Target the easiest, safest and most joyful place to shop and work. From digital to supply chain to cybersecurity, develop innovations that power the future of retail while relying on best-in-class data science algorithms that drive value. Target Tech is at the forefront of the industry, revolutionizing technology efficiency with cutting-edge data and AI. ITDP meticulously tracks tech data points across stores, multi-cloud environments, data centers, and distribution centers. IT Data Platform leverages advanced AI algorithms to analyze vast datasets, providing actionable insights that drive strategic decision-making. By integrating Generative AI, it enhances predictive analytics, enabling proactive solutions and optimizing operational efficiencies. Basic Qualifications: 4-year degree or equivalent experience. 3+ years of industry experience in software design, development, and algorithm-related solutions. 3+ years of experience in programming languages such as Java, Python, Scala. Hands-on experience developing distributed systems, large-scale systems, databases and/or backend APIs. Demonstrated experience in analysis and optimization of systems capacity, performance, and operational health. Preferred Qualifications: Experience with Big Data tools and the Hadoop ecosystem, such as Apache Spark, Apache Iceberg, Kafka, ORC, MapReduce, YARN, Hive, HDFS etc. Experience in developing and running a large-scale system. Experience with industry, open-source projects and/or databases and/or large-data distributed systems.
Key Responsibilities: Data Platform Management: Design, implementation, and optimization of the Data Platform, ensuring scalability and data correctness. Development: Oversee the development and maintenance of all core components of the platform. Unified APIs: Implementation of highly scalable APIs with GraphQL/REST at enterprise scale. Platform Monitoring and Observability: Provide monitoring solutions and security tools to ensure the integrity of and trust in Data and APIs. Leadership and Mentorship: Provide technical leadership and mentorship to junior engineers, fostering a culture of collaboration and continuous improvement. Useful Links- Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/diversity-and-inclusion
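The "unified APIs with GraphQL/REST" responsibility above can be sketched as a single resolver layer that serves the same platform data either as a full REST-style resource or as a GraphQL-style field selection. This is a hypothetical illustration; the records, field names, and function are not part of ITDP:

```python
# Hypothetical platform records, as a unified API backend might hold them.
HOSTS = {
    "web-01": {"dc": "stores", "cpu_pct": 71, "status": "healthy"},
    "db-02": {"dc": "cloud", "cpu_pct": 93, "status": "degraded"},
}

def resolve(host_id, fields):
    """Return only the requested fields -- GraphQL-style field selection."""
    record = HOSTS[host_id]
    return {f: record[f] for f in fields if f in record}

# REST-style: the full resource.  GraphQL-style: caller-chosen fields only.
print(HOSTS["db-02"])
print(resolve("db-02", ["status", "cpu_pct"]))
```

The design point is that both API shapes share one data layer and one authorization path, so the REST and GraphQL surfaces cannot drift apart.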

Posted 1 week ago

Apply

3.0 - 5.0 years

32 - 40 Lacs

Pune

Work from Office


Job Title: Senior Engineer, VP. Location: Pune, India. Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals. Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle. Ensuring maintainability and reusability of engineering solutions. Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow. Reviewing engineering plans and quality to drive re-use and improve engineering capability. Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leave. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above. Your key responsibilities: The candidate is expected to: Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities. Champion engineering best practices and guide/mentor the team to achieve high performance. Work closely with business stakeholders, the Tribe Lead, Product Owner and Lead Architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success.
Your skills and experience: Minimum 15 years of IT industry experience in full stack development. Expert in Java, Spring Boot, NodeJS, ReactJS. Strong experience in big data processing - Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow etc. Strong experience in Kubernetes and the OpenShift container platform. Experience in data streaming, i.e. Kafka, Pub/Sub etc. Experience working on public cloud - GCP preferred, AWS or Azure. Knowledge of various distributed/multi-tiered architecture styles - micro-services, data mesh, integration patterns etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions etc. Experience leading teams and mentoring developers. Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS. Advantageous: Having prior experience in the Banking/Finance domain. Having worked on hybrid cloud solutions, preferably using GCP. Having worked on product development. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 week ago


4.0 - 9.0 years

12 - 17 Lacs

Bengaluru

Work from Office


About Us

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Pyramid Overview

As a Sr Engineer in Data Sciences, you will play a crucial role in designing, implementing, and optimizing AI solutions in production. You'll apply best practices in software design, participate in code reviews, and create a maintainable, well-tested codebase with relevant documentation. We will look to you to understand and actively follow foundational programming principles (best practices, unit tests, code organization, basics of CI/CD etc.). You will get the opportunity to develop in one or more approved programming languages (Java, Scala, Python, R), and learn and adhere to best practices in data analysis and data understanding.

Team Overview

The Competitive Intelligence Data Sciences team at Target builds data science models that leverage competitive information for decisioning. The team plays a crucial role in helping Target stay competitive across various business functions.

Position Overview

For this specific role, you will be responsible for deploying and maintaining language models in partnership with a cross-functional technology team that spans Product Engineering, Data Engineering and Data Analytics. You will need to diagnose model performance, review summary statistics and iteratively improve the quality of the downstream decisions.
About You

  • 4-year degree in a quantitative discipline (Science, Technology, Engineering, Mathematics) or equivalent experience; MS in Computer Science, Applied Mathematics, Statistics, Physics or equivalent work or industry experience
  • 4+ years of experience in end-to-end application development, data exploration, data pipelining, API design and optimization of model latency
  • Strong expertise in working with text data and embeddings, building and deploying NLP solutions and integrating with LLM services
  • Expertise in MLOps frameworks and hands-on experience with MLOps tools
  • 2+ years of experience building and deploying AI/ML algorithms into production environments, including model and system monitoring and troubleshooting
  • Highly proficient in programming with Spark (Python and/or Scala)
  • Good understanding of Big Data and distributed architecture, specifically Hadoop, Hive, Spark, Docker, Kubernetes and Kafka
  • Excellent communication skills with the ability to clearly tell data-driven stories through appropriate visualizations, graphs, and narratives
  • Self-driven and results-oriented; able to meet tight timelines
  • Strong team player with the ability to collaborate effectively across a global team
  • Understanding of the retail industry is an added advantage

Bonus Points

  • Experience with deep learning frameworks: TensorFlow, PyTorch or Keras
  • Experience developing highly distributed AI/ML systems at scale
  • Experience with Vertex AI
  • Experience mentoring junior team members in ML skills and career development
  • Experience handling streaming data and setting up real-time services

Know more about us here:
Life at Target: https://india.target.com/
Benefits: https://india.target.com/life-at-target/workplace/benefits
https://india.target.com/life-at-target/belonging

Posted 1 week ago


Exploring Scala Jobs in India

Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech ecosystem and have a high demand for Scala professionals.

Average Salary Range

The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

In the Scala job market, a typical career path may look like:

  1. Junior Developer
  2. Scala Developer
  3. Senior Developer
  4. Tech Lead

As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.

Related Skills

In addition to Scala expertise, employers often look for candidates with the following skills:

  • Java
  • Spark
  • Akka
  • Play Framework
  • Functional programming concepts

Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.

Interview Questions

Here are 25 interview questions that you may encounter when applying for Scala roles:

  • What is Scala and why is it used? (basic)
  • Explain the difference between val and var in Scala. (basic)
  • What is pattern matching in Scala? (medium)
  • What are higher-order functions in Scala? (medium)
  • How does Scala support functional programming? (medium)
  • What is a case class in Scala? (basic)
  • Explain the concept of currying in Scala. (advanced)
  • What is the difference between map and flatMap in Scala? (medium)
  • How does Scala handle null values? (medium)
  • What is a trait in Scala and how is it different from an abstract class? (medium)
  • Explain the concept of implicits in Scala. (advanced)
  • What is the Akka toolkit and how is it used in Scala? (medium)
  • How does Scala handle concurrency? (advanced)
  • Explain the concept of lazy evaluation in Scala. (advanced)
  • What is the difference between List and Seq in Scala? (medium)
  • How does Scala handle exceptions? (medium)
  • What are Futures in Scala and how are they used for asynchronous programming? (advanced)
  • Explain the concept of type inference in Scala. (medium)
  • What is the difference between object and class in Scala? (basic)
  • How can you create a Singleton object in Scala? (basic)
  • What is a higher-kinded type in Scala? (advanced)
  • Explain the concept of for-comprehensions in Scala. (medium)
  • How does Scala support immutability? (medium)
  • What are the advantages of using Scala over Java? (basic)
  • How do you implement pattern matching in Scala? (medium)
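To make some of these questions concrete, here is a small, self-contained Scala sketch touching several of the basic and medium topics above: `val` vs `var`, case classes, pattern matching, higher-order functions, and handling missing values with `Option` instead of `null`. The names (`User`, `describe`) are illustrative only.

```scala
object ScalaInterviewSketch extends App {
  // val vs var: a val is an immutable reference, a var can be reassigned.
  val fixed = 10
  var mutable = 10
  mutable += 5

  // Case class: immutable data with structural equality and pattern-matching support.
  case class User(name: String, age: Int)

  // Pattern matching: destructure a case class, with a guard on the first case.
  def describe(u: User): String = u match {
    case User(_, age) if age < 18 => "minor"
    case User(name, _)            => s"adult: $name"
  }

  // Higher-order function: map takes a function (here, _ * 2) as an argument.
  val doubled = List(1, 2, 3).map(_ * 2) // List(2, 4, 6)

  // Null handling: Option makes absence explicit instead of using null.
  val maybeUser: Option[User] = Some(User("Asha", 30))
  val label = maybeUser.map(describe).getOrElse("no user")

  println(describe(User("Asha", 30))) // adult: Asha
  println(doubled)                    // List(2, 4, 6)
  println(label)                      // adult: Asha
}
```

Working through short snippets like this before an interview makes it much easier to answer the conceptual questions with concrete examples.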

Closing Remark

As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!
