3.0 - 7.0 years
0 Lacs
vadodara, gujarat
On-site
Job Description:
You will apply data modelling, data mining and data segmentation techniques to the available data to identify and analyze trends or patterns in complex data sets using statistical methods. You will interpret the data based on the analyzed trends and patterns, collaborate with management to prioritize business requirements, and identify improvement opportunities in business processes.

Key Responsibilities:
- Apply data modelling, data mining and data segmentation techniques to the available data (see the sketch after this listing)
- Identify and analyze trends or patterns in complex data sets using statistical methods
- Interpret the data based on the analyzed trends and patterns
- Collaborate with management to prioritize business requirements
- Identify improvement opportunities in business processes

Qualifications Required:
- 3-5 years of experience as a Data Analyst
- Experience in data-driven problem solving
- Strong command of data modelling, data mining and data segmentation methods
- Excellent analytical, business modelling, and problem-solving skills
- Advanced Microsoft Excel skills
- Experience working with MS SQL
- Knowledge of R programming
- Experience using XML, ETL frameworks and JavaScript
- Excellent at creating queries and reports
- Good knowledge of Microsoft Office
- Excellent communication skills (verbal and written)
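As a hedged illustration of the data-segmentation work this listing describes, the sketch below clusters customers with scikit-learn. The input file, column names, and cluster count are hypothetical, not details from the posting.

```python
# Illustrative customer segmentation sketch (hypothetical columns and file).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Assume a customer extract with spend and frequency metrics.
df = pd.read_csv("customers.csv")  # hypothetical input file
features = df[["monthly_spend", "visit_frequency", "tenure_months"]]

# Standardize features so no single metric dominates the distance measure.
scaled = StandardScaler().fit_transform(features)

# Partition customers into four segments; the cluster count is an assumption.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
df["segment"] = kmeans.fit_predict(scaled)

# Profile each segment to interpret the resulting patterns.
print(df.groupby("segment")[["monthly_spend", "visit_frequency"]].mean())
```

In practice the cluster count would be chosen by inspecting segment profiles or a metric such as silhouette score, rather than fixed in advance.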
Posted 22 hours ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Technical Business Analyst with 4-8 years of experience in the Banking domain, your role will involve the following responsibilities:
- Write functional specifications based on business requirements
- Ensure requirements are clearly understood, documented, and translated for the development team
- Act as the primary contact for queries from the development team
- Guarantee adequate test coverage and ensure high-quality artifacts before user delivery
- Serve as the primary or secondary point of contact for User Acceptance Testing (UAT) and other tests
- Define test plans/test cases and conduct effective testing when necessary
- Manage agile master activities within the team
- Possess a solid understanding of data visualization tools
- Generate reports and dashboards using Power BI, PowerPoint presentations, and Excel spreadsheets
- Apply expertise in reporting tools and programming languages such as Python, ETL frameworks, SQL, and macros
- Use good knowledge of databases for data analysis, modeling, interpretation, and connection to BI tools
- Assist in addressing and elaborating on inquiries from end recipients, focusing on data cleanup, processing, and analysis
- Promptly inform or escalate any issues identified during the reporting process to management
- Define, monitor, and report on Key Performance Indicators (KPIs) and Key Risk Indicators (KRIs) related to essential tasks and activities (a brief KPI sketch follows this listing)
- Identify and define opportunities for process improvements
- Collaborate with management to prioritize business and informational requirements
- Demonstrate excellent verbal and written communication skills and strong interpersonal abilities
- Plan work effectively and meet deadlines
- Maintain a high level of accuracy and attention to detail

Qualifications Required:
- 4-8 years of experience in the Banking domain
- Experience in a Technical Business Analyst role
- Proficiency in MS Office, SharePoint, Agile methodology, SQL, and Power BI/Tableau

Joining this company offers you the opportunity to work in a cross-cultural team across multiple geographical locations. You will communicate regularly, both verbally and in writing, with counterparts in SG Paris.
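A minimal sketch of the KPI monitoring described above, assuming a hypothetical transaction extract; the column names and the 95% threshold are illustrative, not requirements from the posting.

```python
# Hypothetical KPI: straight-through processing (STP) rate per month.
import pandas as pd

# Assume an extract of payment transactions with a manual-repair flag.
tx = pd.read_csv("transactions.csv", parse_dates=["processed_at"])

monthly = tx.groupby(tx["processed_at"].dt.to_period("M")).agg(
    total=("transaction_id", "count"),
    manual_touches=("required_manual_repair", "sum"),
)
monthly["stp_rate_pct"] = 100 * (1 - monthly["manual_touches"] / monthly["total"])

# Flag months breaching an assumed 95% KPI threshold for escalation.
print(monthly[monthly["stp_rate_pct"] < 95])
```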
Posted 22 hours ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
Role Overview:
As a Data Engineer, Senior Consultant for the Asia Pacific region based in Bangalore, you will bring deep expertise in data architecture and big data/data warehousing, and the ability to build large-scale data processing systems using the latest database and data processing technologies. Your role will be crucial in enabling VISA's clients in the region with the foundational data capabilities needed to scale their data ecosystems for next-generation portfolio optimization and hyper-personalized marketing, especially within the BFSI space. You will work closely with VISA's market teams in AP, acting as a bridge between end users and technology colleagues in Bangalore and the US to influence the development of global capabilities while providing local tools and technologies as required. Your consulting, communication, and presentation skills will be essential for collaborating with clients and internal cross-functional team members at various levels. You will also work on strategic client projects using VISA data, requiring proficiency in hands-on detailed design and coding using big data technologies.

Key Responsibilities:
- Act as a trusted advisor to VISA's clients, offering strategic guidance on designing and implementing scalable data architectures for advanced analytics and marketing use cases.
- Collaborate with senior management, business units, and IT teams to gather requirements, align data strategies, and ensure successful adoption of solutions.
- Integrate diverse data sources in batch and real time to create a consolidated view, such as a single customer view (see the sketch after this listing).
- Design, develop, and deploy robust data platforms and pipelines leveraging technologies like Hadoop, Spark, modern ETL frameworks, and APIs.
- Ensure data solutions adhere to client-specific governance and regulatory requirements related to data privacy, security, and quality.
- Design target platform components, data flow architecture, and capacity requirements for scalable data architecture implementation.
- Develop and deliver training materials, documentation, and workshops to upskill client teams and promote data best practices.
- Review scripts for best practices, educate the user base, and build training assets for beginner and intermediate users.

Qualifications:
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- 12+ years of progressive experience in data advisory, data architecture & governance, and data engineering roles.
- Good understanding of the Banking and Financial Services domain, with familiarity with enterprise analytics data assets.
- Experience consulting with clients on data architecture and engineering solutions, translating business needs into technical requirements.
- Expertise in distributed data architecture, modern BI tools, and frameworks/packages used for generative AI and machine learning model development.
- Strong resource planning, project management, and delivery skills with a track record of successfully leading or contributing to large-scale data initiatives.
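A minimal PySpark sketch of the single-customer-view integration mentioned in the responsibilities; the source paths, schemas, and aggregations are assumptions for illustration only.

```python
# Minimal sketch: consolidating two sources into a single customer view.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("single-customer-view").getOrCreate()

# Hypothetical inputs: card transactions (batch) and a CRM profile extract.
txns = spark.read.parquet("s3://bucket/transactions/")      # placeholder path
profiles = spark.read.parquet("s3://bucket/crm_profiles/")  # placeholder path

# Aggregate behaviour per customer, then enrich with profile attributes.
spend = txns.groupBy("customer_id").agg(
    F.sum("amount").alias("total_spend"),
    F.countDistinct("merchant_id").alias("distinct_merchants"),
)
customer_view = profiles.join(spend, on="customer_id", how="left")

# Persist the consolidated view for downstream personalization use cases.
customer_view.write.mode("overwrite").parquet("s3://bucket/single_customer_view/")
```

A real-time variant would merge streaming updates into the same view, but the batch join above captures the core consolidation idea.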
Posted 3 days ago
0.0 years
0 Lacs
hyderabad, telangana, india
On-site
Role Summary & Role Description:
Technical Manager with specific expertise in Oracle and PL/SQL to design, develop, and optimize data workflows on the Databricks platform. The ideal candidate will have deep expertise in Apache Spark, PySpark, Python, job orchestration, and CI/CD integration to support scalable data engineering and analytics solutions.

- Analyzes, designs, develops and maintains software applications to support business units
- Expected to spend 80% of the time on hands-on development, design and architecture, and the remaining 20% on guiding the team on technology and removing other impediments
- Capital Markets project experience preferred
- Provides advanced technical expertise in analyzing, designing, estimating, and developing software applications to project schedule
- Oversees systems design and implementation of the most complex design components
- Creates project plans and deliverables and monitors task deadlines
- Oversees, maintains and supports existing software applications
- Provides subject matter expertise in reviewing, analyzing, and resolving complex issues
- Designs and executes end-to-end system tests of new installations and/or software prior to release to minimize failures and impact to business and end users
- Responsible for resolution, communication, and escalation of critical technical issues
- Prepares user and systems documentation as needed
- Identifies and recommends industry best practices; serves as a mentor to junior staff
- Acts as a technical lead/mentor for developers in day-to-day and overall project areas; able to lead a team of agile developers
- Has worked on complex, deadline-driven projects with minimal supervision
- Able to architect/design/develop from minimal requirements by effectively coordinating activities between business analysts, scrum leads, developers and managers
- Able to provide agile status notes on day-to-day project tasks

Key Responsibilities:
- Data Lakehouse Development: Design and implement scalable data lakehouse solutions using Databricks and Delta Lake. Develop and maintain Delta Lake tables with ACID transactions and schema evolution.
- Data Ingestion & Autoloaders: Build ingestion pipelines using Databricks Autoloader for real-time and batch data. Integrate data from various sources including cloud storage (e.g., ADLS, S3), databases, APIs, and streaming platforms. (A brief Autoloader sketch follows this listing.)
- ETL & Data Enrichment: Develop ETL workflows to cleanse, transform, and enrich raw data. Implement business logic and data quality checks to ensure reliability and accuracy.
- Performance Optimization: Optimize Spark jobs for performance and cost-efficiency. Monitor and troubleshoot data pipelines using Databricks tools and logging frameworks.
- Data Access & Governance: Enable secure and governed access to data using Unity Catalog or other access control mechanisms. Collaborate with data analysts and scientists to ensure data availability and usability.
- Collaboration & Documentation: Work closely with cross-functional teams including data architects, analysts, and business stakeholders. Document data models, pipeline designs, and operational procedures.

Technical Skills:
- Design and implement robust ETL pipelines using Databricks notebooks and workflows
- Strong proficiency in Apache Spark, PySpark, and Databricks
- Hands-on experience with Delta Lake, Autoloader, and Structured Streaming
- Proficiency in SQL, Python, and cloud platforms (Azure, AWS, or GCP)
- Experience with ETL frameworks, data modeling, and data warehousing concepts
- Familiarity with CI/CD, Git, and DevOps practices in data engineering
- Knowledge of data governance, security, and compliance standards
- Optimize Spark jobs for performance and cost-efficiency
- Develop and manage job orchestration strategies using Databricks Jobs and Workflows
- Monitor and troubleshoot production jobs, ensuring reliability and data quality
- Implement security and governance best practices, including access control and encryption
- Strong practical experience with Scrum, agile modelling and adaptive software development
- Ability to understand and grasp the big picture of system components
- Experience building environment, architecture and design guides and architecture and application blueprints
- Strong understanding of data modeling, warehousing, and performance tuning
- Excellent problem-solving and communication skills

Core/Must-have skills: Oracle, SQL, PL/SQL, Python, Scala, Apache Spark, Spark Streaming, CI/CD pipelines, AWS cloud experience. Exposure to real-time data processing and event-driven architectures.

Good-to-have skills: Databricks certifications (e.g., Data Engineer Associate/Professional). Experience with Unity Catalog, MLflow, or Delta Live Tables.

Work Schedule: 12 PM to 9 PM IST

Why this role is important to us:
Our technology function, Global Technology Services (GTS), is vital to State Street and is the key enabler for our business to deliver data and insights to our clients. We're driving the company's digital transformation and expanding business capabilities using industry best practices and advanced technologies such as cloud, artificial intelligence and robotic process automation. We offer a collaborative environment where technology skills and innovation are valued in a global organization. We're looking for top technical talent to join our team and deliver creative technology solutions that help us become an end-to-end, next-generation financial services company. Join us if you want to grow your technical skills, solve real problems and make your mark on our industry.

About State Street:
What we do. State Street is one of the largest custodian banks, asset managers and asset intelligence companies in the world. From technology to product innovation, we're making our mark on the financial services industry. For more than two centuries, we've been helping our clients safeguard and steward the investments of millions of people. We provide investment servicing, data & analytics, investment research & trading and investment management to institutional clients.

Work, Live and Grow. We make all efforts to create a great work environment. Our benefits packages are competitive and comprehensive. Details vary by location, but you may expect generous medical care, insurance and savings plans, among other perks. You'll have access to flexible Work Programs to help you match your needs. And our wealth of development programs and educational support will help you reach your full potential.

Inclusion, Diversity and Social Responsibility. We truly believe our employees' diverse backgrounds, experiences and perspectives are a powerful contributor to creating an inclusive environment where everyone can thrive and reach their maximum potential while adding value to both our organization and our clients. We warmly welcome candidates of diverse origin, background, ability, age, sexual orientation, gender identity and personality.
Another fundamental value at State Street is active engagement with our communities around the world, both as a partner and a leader. You will have tools to help balance your professional and personal life, paid volunteer days, matching gift programs and access to employee networks that help you stay connected to what matters to you. State Street is an equal opportunity and affirmative action employer. Discover more at StateStreet.com/careers
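A minimal sketch of the Autoloader-to-Delta-Lake ingestion pattern named in the responsibilities above, assuming a Databricks runtime recent enough to support availableNow triggers; all paths and table names are placeholders.

```python
# Sketch of a Databricks Autoloader ingestion pipeline into a Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = (
    spark.readStream.format("cloudFiles")          # Autoloader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/lake/_schemas/events")
    .load("/mnt/landing/events/")
)

query = (
    raw.writeStream
    .option("checkpointLocation", "/mnt/lake/_checkpoints/events")
    .trigger(availableNow=True)                    # incremental batch-style run
    .toTable("bronze.events")                      # Delta table with ACID writes
)
```

The same stream covers both modes in the job description: leave it running for real-time ingestion, or schedule it with `availableNow` to drain new files in batches.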
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
Qualcomm India Private Limited is currently looking for a dedicated Staff IT Data Engineer to join the Enterprise Architecture, Data and Services (EADS) Team. In this role, you will be responsible for designing, developing, and supporting advanced data pipelines within a multi-cloud environment, with a primary focus on Databricks. Your main duties will include facilitating data ingestion from various sources, processing data using a standardized data lake approach, and provisioning data for analytical purposes. The position requires strong DevSecOps practices, including Continuous Integration/Continuous Deployment (CI/CD), Infrastructure as Code (IaC), and automated testing, to improve data operations, monitor and optimize data loads, and establish data retention policies (a brief maintenance sketch follows this listing).

As a qualified candidate, you should have experience designing, architecting, and presenting data systems for customers, with a minimum of 5 years of experience in data engineering, architecture, or analytics roles. Proficiency in the Databricks platform, including cluster management, Delta Lake, MLflow, and Unity Catalog with Collibra integration, is essential. Expertise in Spark, Python, SQL, and ETL frameworks, as well as experience with cloud platforms like AWS and their integration with Databricks, is also required, along with a deep understanding of data warehousing concepts, big data processing, and real-time analytics.

To be considered for this position, you must hold a Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related field, with a minimum of 5 years of IT-related work experience, or 7+ years of IT-related work experience without a Bachelor's degree. Strong programming experience, preferably in Python or Java, along with significant experience in Databricks and SQL or NoSQL databases, is mandatory. Familiarity with data modeling, CI/CD, DevSecOps, and related technologies is an advantage.

Preferred qualifications include Databricks Certified Data Engineer or Databricks-AWS platform architect credentials, experience with cloud platforms like Azure and GCP, and knowledge of big data handling with PySpark. A Bachelor of Science in Computer Science, Information Technology, Engineering, or a related field, or equivalent professional experience, is required.

Qualcomm India Private Limited is an equal opportunity employer and is committed to providing accessible processes for individuals with disabilities. If you require accommodations during the application/hiring process, please contact Qualcomm via email at disability-accommodations@qualcomm.com or through their toll-free number. Please note that only individuals seeking a position directly at Qualcomm are permitted to use their Careers Site; staffing and recruiting agencies should refrain from submitting profiles, applications, or resumes.
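A hedged sketch of the Delta Lake maintenance and retention work described above; the table names and retention window are assumptions, and VACUUM retention must respect the minimum configured in your workspace.

```python
# Hedged sketch: a Databricks maintenance job for load monitoring and retention.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

for table in ["lake.bronze_events", "lake.silver_events"]:  # placeholder tables
    # Compact small files produced by frequent ingestion loads.
    spark.sql(f"OPTIMIZE {table}")
    # Apply an assumed 30-day retention policy to stale snapshot files.
    spark.sql(f"VACUUM {table} RETAIN 720 HOURS")

# Simple load monitoring: inspect the most recent operations on a table.
history = spark.sql("DESCRIBE HISTORY lake.bronze_events LIMIT 5")
history.select("timestamp", "operation", "operationMetrics").show(truncate=False)
```

Scheduled through a Databricks Job (and versioned via CI/CD), this kind of script is one way the posting's "monitor and optimize data loads" duty is automated in practice.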
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior AWS Data Engineer Vice President at Barclays, you will play a pivotal role in shaping the future by spearheading the evolution of the Reference Data Services function. You will work collaboratively with a team of engineers to oversee the engineering process from strategy and design to build, documentation, and testing of software components. Your responsibilities will extend to working closely with colleagues in the cloud and middleware organization to bring products to life in a modern, developer-focused environment. Effective stakeholder management, leadership, and decision-making skills are essential to support business strategy and risk management.

To excel in this role, you should have experience with AWS cloud services such as S3, Glue, Athena, Lake Formation, and CloudFormation (a brief sketch using these services follows this listing). Senior-level proficiency in Python for data engineering and automation purposes is crucial, and familiarity with ETL frameworks, data transformation, and data quality tools will be beneficial. Highly valued skills may include an AWS Data Engineer certification, prior experience in the banking or financial services domain, expertise in IAM and permissions management in AWS cloud, and proficiency with tools such as Databricks, Snowflake, Starburst, and Iceberg.

Your primary goal will be to build and maintain systems that collect, store, process, and analyze data effectively, ensuring accuracy, accessibility, and security. This involves developing data architecture pipelines, designing and implementing data warehouses and data lakes, and creating processing and analysis algorithms tailored to the complexity and volume of the data. Collaboration with data scientists to build and deploy machine learning models will also be part of your responsibilities.

As a Vice President, you are expected to contribute to setting strategy, driving requirements, and making recommendations for change. Managing resources, budgets, and policies, delivering continuous improvements, and addressing policy breaches are key aspects of this role. If you have leadership responsibilities, you are required to demonstrate leadership behaviours that create an environment for colleagues to thrive and deliver excellent results consistently. For individual contributors, being a subject matter expert within your discipline, guiding technical direction, leading collaborative assignments, and mentoring less experienced specialists are essential.

Your role will be based in Pune, India, and you will collaborate with key stakeholders, functional leadership teams, and senior management to provide insights on functional and cross-functional areas of impact and alignment. Managing and mitigating risks through assessment, demonstrating leadership in managing risks, and strengthening controls will be critical. You are expected to have a comprehensive understanding of organizational functions to contribute to achieving business goals. Building and maintaining relationships with internal and external stakeholders, using influencing and negotiating skills to achieve outcomes, is also part of your responsibilities.

All colleagues are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, along with the Barclays Mindset to Empower, Challenge, and Drive. These values and mindset serve as our moral compass and operating manual for behaviour.
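A minimal boto3 sketch of querying S3-backed data through Athena, one of the AWS services named above; the region, database, query, and output bucket are placeholders.

```python
# Hedged sketch: running an Athena query over S3 data with boto3.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")  # placeholder region

response = athena.start_query_execution(
    QueryString="SELECT instrument_id, COUNT(*) AS n FROM reference_data GROUP BY instrument_id",
    QueryExecutionContext={"Database": "ref_data_db"},    # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)

# Poll for completion before fetching results (simplified, fixed interval).
query_id = response["QueryExecutionId"]
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows)} rows")
```

In a production pipeline the table would typically be registered by a Glue crawler and permissions governed through Lake Formation, per the services listed above.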
Posted 2 weeks ago
0.0 years
0 Lacs
pune, maharashtra, india
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Carry out software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Promote development standards, code reviews, mentoring and knowledge sharing
- Provide production support and troubleshooting
- Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring
- Liaise with BAs to ensure that requirements are correctly interpreted and implemented
- Participate in regular planning and status meetings
- Provide input to the development process through involvement in sprint reviews and retrospectives
- Provide input into system architecture and design
- Conduct peer code reviews

Requirements
To be successful in this role, you should meet the following requirements:
- Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+
- Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services
- Sound knowledge of working on the Unix/Linux platform
- Hands-on experience building data pipelines using Hadoop components such as Hive, Spark and Spark SQL (a brief pipeline sketch follows this listing)
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins) and requirement management in JIRA
- Understanding of big data modelling techniques using relational and non-relational approaches
- Experience debugging code issues and publishing the highlighted differences to the development team/architects
- Experience with time-series/analytics databases such as Elasticsearch
- Experience with scheduling tools such as Airflow and Control-M
- Understanding or experience of cloud design patterns
- Exposure to DevOps and Agile project methodologies such as Scrum and Kanban
- Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets

Location: Pune and Bangalore

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by - HSDI
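A minimal PySpark sketch of the Hive/Spark SQL pipeline work described in the requirements; the database, table, and column names are illustrative assumptions.

```python
# Minimal sketch of a Hive-backed Spark pipeline (illustrative names only).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("trade-enrichment")
    .enableHiveSupport()            # read/write Hive metastore tables
    .getOrCreate()
)

# Read a raw Hive table, apply a transformation, and publish a curated table.
trades = spark.table("raw_db.trades")
enriched = (
    trades.filter(F.col("status") == "SETTLED")
    .withColumn("notional_usd", F.col("quantity") * F.col("price_usd"))
)

enriched.write.mode("overwrite").partitionBy("trade_date").saveAsTable(
    "curated_db.settled_trades"
)
```

On the cluster described in this posting, a job like this would typically run under YARN and be scheduled via Airflow or Control-M.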
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
bengaluru, karnataka, india
On-site
We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

SAP Ariba is seeking a highly skilled Senior Developer to drive the success of our advanced analytics-based applications. In this role, you will be responsible for defining the architecture, strategy, and technical vision for next-generation analytics solutions that are scalable, intelligent, and highly interactive.

What you'll do:
You will collaborate closely with:
- Product Managers to translate business requirements into technical solutions
- Engineering Teams to architect, design, and oversee implementation
- Engineers to mentor and guide them in building innovative analytics-driven applications

Key Responsibilities:
- Architect, design, and implement end-to-end analytics solutions that enable data-driven decision-making
- Develop high-performance, scalable data management and analytics architectures
- Integrate modern data engineering and analytics technologies into existing systems
- Establish best practices for data modeling, data mining, and real-time analytics
- Lead the development of custom analytics applications and AI-driven insights
- Work with cross-functional teams to ensure seamless data integration and governance
- Define and implement disaster recovery and high-availability strategies
- Continuously evaluate and recommend improvements to enhance data reliability, performance, and quality

What you bring:
Education & Experience
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field
- 10+ years of professional experience in product development, analytics, and architecture
- Strong ability to work in an agile environment and in cross-functional global teams
- Excellent communication, presentation, and strategic thinking skills
- Proven expertise in translating complex technical concepts into business-friendly insights

Technical Expertise
- Proficiency in Java, Python, Scala, or GoLang for analytics and data-driven application development
- Expertise in SAP HANA and cloud environments (AWS, Kubernetes, GCP, or Azure)
- Strong knowledge of AI/ML models, including large language models (LLMs)
- Deep understanding of data structures, algorithms, design patterns, and OOAD
- Hands-on experience with relational and non-relational databases
- Expertise in data pipelines, job scheduling, and ETL frameworks
- Experience working with Big Data ecosystems (Spark, Hive, Hadoop) is a plus

Preferred Skills
- Strong background in API development and procurement business processes
- Knowledge of data security, compliance, and governance best practices
- Ability to drive innovation and continuous improvement in analytics architecture
- Self-driven, highly motivated, and passionate about solving complex data challenges

Meet the team:
SAP Ariba, an SAP company, is the global leader in business commerce networks, with over $1 trillion in commerce among more than 2 million connected companies. Today, about 63% of the world's enterprise transaction revenues touch an SAP system. By connecting many of those enterprises and their ecosystem of partners, suppliers and customers to the business network, SAP is aiming to become a leader in the fast-growing segment of cloud-based business networks. These networks enable companies to connect, collaborate, buy and sell across the globe in new and innovative ways. The Global 2000 companies spend $12 trillion with their suppliers today; the fact that the majority of Global 2000 companies are customers of SAP provides a huge market potential for network-based business-to-business collaboration.

Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion
SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone - regardless of background - feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: [HIDDEN TEXT]

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al.), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 432229 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
Posted 2 weeks ago
8.0 - 13.0 years
25 - 40 Lacs
pune
Hybrid
Company: Dataceria Software Solutions
Location: Pune
Work: Hybrid (2-3 days in office)
Timings: aligned to US and UK timings
Position: Senior Backend Developer - Java, ETL & Microservices
Experience: 8+ years
Job type: Permanent
We are looking for immediate joiners or candidates with a notice period of 15-30 days.

If you are interested, send your CV to careers@dataceria.com with the details below:
Experience:
CTC:
ECTC:
Notice period:
Current work location:

Senior Backend Developer - Java, ETL & Microservices
Dataceria is looking for experienced Senior Java Microservices Developers to join a strategic modernization project for a global client. You will work on complex enterprise applications, collaborating with architects, leads, and other developers to deliver high-quality solutions.

Responsibilities:
- Develop, maintain, and optimize Java 8+ backend services using Spring Boot and Hibernate/JPA
- Build, deploy, and maintain microservices within Kubernetes (AKS or equivalent)
- Design and implement ETL pipelines using Apache Airflow, Spring Batch, or Apache Camel (see the Airflow sketch after this listing)
- Work with Snowflake to create and manage pipelines, connect and deploy databases, and perform query optimization
- Integrate messaging solutions using Kafka or other enterprise messaging frameworks
- Collaborate with cloud infrastructure teams to deploy and maintain services in Azure Cloud (or adapt from AWS/GCP)
- Write and maintain RESTful APIs for microservices communication
- Participate in CI/CD pipeline setup, deployment automation, and version control (Git)
- Collaborate with cross-functional teams in Agile scrum environments

Requirements:
- At least 8 years of software development experience (Java)
- Expert in Core Java (Java 8+), J2EE, Spring Boot, Hibernate/JPA
- REST APIs, JSON, microservices architecture
- Spring frameworks (Spring Boot, Spring Cloud Services, Spring Security, etc.)
- ETL frameworks such as Apache Airflow, Spring Batch, or Apache Camel (any one, or similar tools)
- Strong experience with SQL (MS-SQL, PostgreSQL), Snowflake, and NoSQL databases (preferably Cosmos DB) or similar tools
- Proven experience with Azure Cloud, Docker, and Kubernetes (AKS) or similar tools
- Enterprise messaging systems (Kafka)
- CI/CD setup and troubleshooting (preferably Azure DevOps)
- Exceptional leadership and communication skills
- Strong problem-solving and analytical thinking
- Ability to manage multiple priorities in a fast-paced environment
- Adaptability and flexibility to change
- Willingness to occasionally work outside normal hours

Preferred Skills:
- Knowledge of UI technologies (React JS, JavaScript, HTML5, CSS3)
- Financial/banking experience
- Experience with Maven, Gradle, Git
- DB performance tuning and Infrastructure-as-Code (Terraform)
- Knowledge of Control-M, Dynatrace, and ServiceNow

In short:
Mandatory skills
- Java 8+ / backend development
- Spring Boot / Spring frameworks
- Hibernate / JPA
- Microservices architecture
- Kubernetes (AKS, EKS or GKE)
- ETL frameworks (at least one of Airflow, Spring Batch or Apache Camel)
- Cloud (Azure or other)
- Kafka (enterprise messaging)
- SQL (MS-SQL, PostgreSQL) and NoSQL databases
- Snowflake (or BigQuery, Redshift or Azure Synapse Analytics)
- CI/CD (Azure DevOps or similar)
- Agile

Optional skills
- Legacy system experience
- Terraform (IaC)
- UI development (ReactJS, JS, HTML, CSS)
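A minimal Apache Airflow sketch of the ETL orchestration this role calls for (the Airflow 2.4+ `schedule` argument is assumed); the DAG id, schedule, and task bodies are placeholders.

```python
# Hedged sketch of an Airflow-orchestrated ETL, with placeholder task logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source records")      # e.g., query an upstream database


def transform():
    print("cleanse and enrich")       # e.g., apply business rules


def load():
    print("write to Snowflake")       # e.g., bulk-load the curated output


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3
```

The same extract/transform/load shape maps onto Spring Batch or Apache Camel; Airflow is shown here simply because its DAGs are plain Python.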
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere.

ACV is a technology company that has revolutionized how dealers buy and sell cars online. We are transforming the automotive industry. ACV Auctions Inc. (ACV) has applied innovation and user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling, and managing used vehicles with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. ACV's network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, and ACV Capital within its Marketplace Products, as well as True360 and Data Services.

ACV Auctions is opening its new India Development Center in Chennai, India, and we're looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles. At ACV, we put people first and believe in the principles of trust and transparency. If you are looking for an opportunity to work with the best minds in the industry and solve unique business and technology problems, look no further! Join us in shaping the future of the automotive marketplace! At ACV, we focus on the Health, Physical, Financial, Social, and Emotional Wellness of our teammates, and to support this we offer industry-leading benefits and wellness programs.

We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud-based technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through automation with Terraform, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in developing solutions to ACV's most complex data and software problems. You will operate in a high-performing team, balance high-quality deliverables with customer focus, communicate effectively, mentor and guide engineers, and deliver results in a fast-paced environment. You will act as a technical liaison who can balance high-quality delivery with customer focus.

In this role, you will:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications
- Influence company-wide engineering standards for databases, tooling, languages, and build systems
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data
- Design, implement, and maintain tools and best practices for access control, data versioning, database management, and migration strategies
- Contribute to, influence, and set standards for all technical aspects of a product or service, including coding, testing, debugging, performance, languages, database selection, management, and deployment
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions
- Write clean, maintainable, well-commented code and automation to support the data infrastructure layer
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance
- Collaborate with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency
- Participate in SOX audits, including the creation of standards and reproducible audit evidence through automation
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides
- Maintain and extend existing database operations solutions for backups, index defragmentation, data retention, and similar tasks (a brief retention sketch follows this listing)
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively
- Be accountable for the overall performance of products and/or services within a defined area of focus
- Take part in the on-call rotation, handle multiple competing priorities in an agile, fast-paced environment, and perform additional duties as assigned

To be eligible for this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- The ability to read, write, speak, and understand English
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer
- 1+ years of work with continuous integration and build tools
- 1+ years of experience programming in Python
- 1+ years of experience with cloud platforms, preferably GCP/AWS
- Knowledge of day-to-day tooling, including deployments, Kubernetes, monitoring systems, and testing tools
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing
- Hands-on skills and the ability to drill deep into complex system design and implementation
- Experience with DevOps practices and tools for database automation and infrastructure provisioning
- Programming in Python and SQL; experience with GitHub, Jenkins, infrastructure-as-code tooling such as Terraform (preferred), big data technologies, and distributed databases

Nice-to-have qualifications:
- Experience with NoSQL data stores, Airflow, Docker, containers, Kubernetes, Datadog, and Fivetran
- Database monitoring and diagnostic tools, preferably Datadog
- Database management/administration with PostgreSQL, MySQL, DynamoDB, MongoDB, GCP/BigQuery, and Confluent Kafka
- Experience using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP
- Service-oriented architecture/microservices and event sourcing in a platform like Kafka (preferred)
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning
- Hands-on experience with SOX compliance requirements
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques

Our Values:
- Trust & Transparency
- People First
- Positive Experiences
- Calm Persistence
- Never Settling
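A hedged sketch of the kind of automated data-retention task listed among the responsibilities, using psycopg2 against a hypothetical PostgreSQL table; the connection details, table, and retention window are assumptions.

```python
# Hedged sketch: prune expired rows under an assumed 90-day retention policy.
import psycopg2

RETENTION_DAYS = 90

conn = psycopg2.connect(host="db.internal", dbname="ops", user="maint", password="...")
try:
    with conn, conn.cursor() as cur:
        # Delete expired rows; the connection context commits on success.
        cur.execute(
            "DELETE FROM audit_events "
            "WHERE created_at < now() - %s * interval '1 day'",
            (RETENTION_DAYS,),
        )
        print(f"removed {cur.rowcount} expired rows")
finally:
    conn.close()
```

Run on a schedule (and logged for audit evidence), a script like this doubles as the reproducible SOX artifact the posting mentions.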
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The 2nd Line Assurance function at VOIS is responsible for monitoring and mitigating key technology risks, fostering a risk-aware culture, and advocating a risk-sensitive approach to processes, architectures, applications, and platforms within the Technology domain. The team collaborates closely with local markets and Group functions to test established controls and identify corrective actions to manage risks effectively. Reporting to key stakeholders, including the Technology leadership team and board committees, is a crucial aspect of this role.

As part of VOIS, you will be expected to hold a technical degree, with a preference for BE/BTech qualifications, and to be familiar with relevant frameworks such as SOX, ISO 27001/27002, and COBIT. Proficiency across a range of platforms is highly valued for this role, including databases (SQL, Oracle, MySQL), reporting packages (Power BI, QlikView, Business Objects), programming languages (XML, JavaScript, ETL frameworks), applications (CRM, HR), operating systems (Linux, Windows), IT networks, firewalls, VPNs, GSM network infrastructure, and telecommunications.

VOIS is committed to being an Equal Opportunity Employer in India. This commitment is reflected in our employees' dedication, earning us certification as a Great Place to Work in India for four consecutive years. Notably, we have been recognized among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and ranked 10th in the Overall Best Workplaces in India by the Great Place to Work Institute in 2023. By joining VOIS, you become part of this commitment, joining a diverse family of individuals with varied cultural backgrounds, perspectives, and skills.

If you are looking to be part of a dynamic team and contribute to managing key technology risks within a global organization, we invite you to apply now. We look forward to welcoming you and exploring the possibilities of working together at VOIS.
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
pune, maharashtra, india
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Carry out software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Promote development standards, code reviews, mentoring and knowledge sharing
- Provide production support and troubleshooting
- Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring
- Liaise with BAs to ensure that requirements are correctly interpreted and implemented
- Participate in regular planning and status meetings
- Provide input to the development process through involvement in sprint reviews and retrospectives
- Provide input into system architecture and design
- Conduct peer code reviews

Requirements
To be successful in this role, you should meet the following requirements:
- Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+
- Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services
- Sound knowledge of working on the Unix/Linux platform
- Hands-on experience building data pipelines using Hadoop components such as Hive, Spark and Spark SQL
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins) and requirement management in JIRA
- Understanding of big data modelling techniques using relational and non-relational approaches
- Experience debugging code issues and publishing the highlighted differences to the development team/architects
- Experience with time-series/analytics databases such as Elasticsearch
- Experience with scheduling tools such as Airflow and Control-M
- Understanding or experience of cloud design patterns
- Exposure to DevOps and Agile project methodologies such as Scrum and Kanban
- Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by - HSDI
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a .Net Software Engineer at Barclays, you will play a crucial role in supporting the successful delivery of location strategy projects by ensuring they meet plan, budget, quality, and governance standards. Your responsibilities will involve driving innovation and excellence in the digital landscape, utilizing cutting-edge technology to enhance our digital offerings and provide exceptional customer experiences.

To excel in this role, you should possess a strong understanding of Object-Oriented Programming (OOP) and design patterns. Additionally, you should have experience with the .NET Framework (3.5 and above), C#, WPF, MVVM, and Python coding/deployments. Knowledge of deployment processes using tools like Bitbucket, GitHub, Veracode, and SonarQube for code quality, along with proven experience in C# and WPF, is essential. You should be capable of drafting technical specifications and communicating effectively with multiple stakeholders. Familiarity with MS SQL Server (T-SQL, stored procedures, triggers, functions, SSIS), programming (XML, ETL frameworks), and data warehousing is also valuable. Strong self-motivation and excellent written and verbal communication skills are key attributes for success in this role.

Desirable skills may include experience with custom workflow solutions or commercial CRM/workflow solutions, and Python coding and deployment. You should be comfortable working in roles like Scrum Master or Product Owner and adapting to a fast-paced delivery environment with a focus on minimum viable product. Experience in delivering change and collaborating with stakeholders from various functions within a financial services organization is beneficial.

Your primary goal will be to design, develop, and enhance software using various engineering methodologies to provide business, platform, and technology capabilities for customers and colleagues. You will collaborate with product managers, designers, and engineers to define software requirements, ensure code quality, and promote knowledge sharing. Staying updated on industry technology trends, contributing to technical communities, and adhering to secure coding practices are integral parts of this role.

As an Analyst, you are expected to perform activities efficiently, demonstrate in-depth technical knowledge, lead and supervise a team if applicable, and contribute to the improvement of related teams and functions. You will be accountable for embedding new policies and procedures for risk mitigation, influencing decision-making, managing risks, and strengthening controls in your work area. Additionally, you should maintain a good understanding of how your sub-function integrates with the wider function and the organization's products, services, and processes.

All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset to Empower, Challenge, and Drive.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Software Engineer - .Net at Barclays, you will be responsible for independently producing requirements breakdowns, writing user stories, and managing delivery in accordance with the business requirement design. You will ensure that all requirements traceability is factored into design, code, and test deliverables. Additionally, you will identify and manage the necessary controls/RAIDs and assist in diagnosing system problems encountered during various phases. Documenting functional specifications and changes and maintaining the application knowledge repository will also be part of your responsibilities.

To be successful in this role, you should have a strong understanding of OOP and design patterns; knowledge of the .NET Framework 3.5 and above, C#, WPF, and MVVM; familiarity with deployment processes and tools such as Bitbucket, GitHub, Veracode, and SonarQube for code quality; and proven working experience in C# with WPF. You should be well-versed in drafting technical specifications and justifying them with multiple stakeholders. Experience with MS SQL Server, programming (XML or ETL frameworks), and data warehousing, along with self-motivation and good written and verbal communication skills, is also required.

Highly valued skills may include experience with custom workflow solutions or commercial CRM/workflow solutions, working in a Scrum Master and/or Product Owner capacity, delivering change with multiple stakeholders from varied functions within a financial services organization, knowledge of Barclays systems, products, and processes, and embracing the Barclays Values and Behaviours.

The purpose of the role is to design, develop, and improve software using various engineering methodologies to provide business, platform, and technology capabilities for customers and colleagues. Your accountabilities will include developing and delivering high-quality software solutions, collaborating with product managers, designers, and other engineers, participating in code reviews, and promoting a culture of code quality and knowledge sharing.

In terms of analyst expectations, you are expected to perform prescribed activities in a timely manner and to a high standard, drive continuous improvement, lead and supervise a team, guide and support professional development, and demonstrate a clear set of leadership behaviours. You will also partner with other functions and business areas, take responsibility for embedding new policies and procedures, advise and influence decision-making, and manage risk and strengthen controls in relation to your work.

Overall, as a Software Engineer - .Net at Barclays, you will play a crucial role in developing software solutions, collaborating with various stakeholders, and contributing to the technical excellence and growth of the organization while upholding the Barclays Values and Mindset.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a hands-on Data Engineer who will be responsible for designing, building, and maintaining scalable data ingestion pipelines. Your main focus will be on delivering high-quality, reliable, and scalable data pipelines to support downstream analytics, machine learning, and business intelligence solutions. You will work with various internal and external sources to onboard structured and semi-structured data, using Azure-native services such as Data Factory, Azure Data Lake, and Event Hubs, as well as tools like Databricks or Apache Spark, for data ingestion and transformation.

Your responsibilities will include developing metadata-driven ingestion frameworks, collaborating with source system owners to define data ingestion specifications, implementing monitoring and alerting on ingestion jobs, and embedding data quality, lineage, and governance principles into ingestion processes. You will also optimize ingestion processes for performance, reliability, and cloud cost efficiency, and support both batch and real-time ingestion needs, including streaming data pipelines where applicable.

To qualify for this role, you should have at least 3 years of hands-on experience in data engineering, with a specific focus on data ingestion or integration. You should have hands-on experience with Azure Data Services or equivalent cloud-native tools, experience in Python (PySpark) for data processing tasks, and familiarity with ETL frameworks, orchestration tools, and API-based data ingestion. Knowledge of data quality and validation strategies, CI/CD practices, version control, and infrastructure-as-code is also required. Experience with SAP is a bonus.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As an integral part of our team at Baxter Planning, you will play a crucial role in ensuring the reliable operation and continuous enhancement of our production environment. Your responsibilities will involve supporting and documenting scheduled production processes, monitoring batch jobs, data loads, and integrations, and maintaining automation, standard operating procedures, and change logs. Troubleshooting complex application and environment issues, coordinating with various internal teams to resolve incidents, and assisting in the configuration and operational deployment of Baxter Planning software will be key aspects of your role. Your contributions will have a direct impact on customer success and operational excellence. If you enjoy working in a fast-paced environment, delving into data pipelines, and effectively communicating complex ideas, you will find a perfect fit within our dynamic team.

Key Responsibilities:
- Support and document scheduled production processes
- Monitor batch jobs, data loads, and integrations
- Maintain automation, standard operating procedures, and change logs
- Troubleshoot complex application and environment issues
- Coordinate with infrastructure, network, and development teams to resolve incidents
- Assist with the configuration and operational deployment of Baxter Planning software
- Develop solutions to connect customer systems and extract and transform data
- Validate and test configurations against business requirements
- Investigate issues/incidents effectively and communicate findings in a root cause analysis

Job Requirements:
- Bachelor's degree or equivalent experience preferred
- 3-4 years of experience in a similar or related role
- Proficiency in working with Linux environments and using command-line tools
- Strong SQL skills (PostgreSQL)
- Analytical mindset with a passion for root-cause investigation
- Detail-oriented, reliable, and willing to go beyond the norm to achieve results
- Ability to work both independently with minimal direction and in a team environment
- Excellent interpersonal, written, and oral communication skills

Preferred Qualifications:
- Experience with supply-chain planning systems
- Exposure to the AWS cloud platform
- Proficiency with ETL frameworks

At Baxter Planning, we nurture a culture of collaboration, continuous learning, and innovation. You will be part of a close-knit team that values open communication, ownership of projects, proactive problem-solving, and recognition of individual contributions. We invest in your professional growth through training and mentorship, ensuring a balance between focused work and a healthy work-life rhythm.

Join us at Baxter Planning and be a key player in maintaining the smooth operation of global supply chains. Benefit from collaborating with a diverse group of professionals, accessing continuous learning opportunities in data engineering and analytics, and enjoying competitive compensation, comprehensive benefits, and flexible work arrangements. If you are ready to drive supply-chain excellence through reliable production support, apply now and contribute to ensuring our planning engine runs flawlessly.
Posted 4 weeks ago
8.0 - 12.0 years
0 - 0 Lacs
pune, maharashtra
On-site
Keyrus is searching for experienced Senior Java Microservices Developers to participate in a strategic modernisation project for a global client. As a Senior Java Microservices Developer, you will be involved in the development of complex enterprise applications, collaborating with architects, leads, and fellow developers to deliver high-quality solutions. The target start date for this position is September 2025, and the role follows a hybrid model with 3 days per week at the Pune office in India. The salary range for this role is 25,00,000 to 35,00,000 INR.

Your responsibilities will include developing, testing, and maintaining scalable microservices and back-end applications using Java and Spring Boot. You will build and maintain RESTful APIs, JSON-based services, and microservices architectures, and work with Spring frameworks such as Spring Boot, Spring Cloud, and Spring Security. Your duties will also involve implementing ETL processes using tools like Apache Airflow, Spring Batch, and Apache Camel; writing optimised SQL queries for relational databases; working with Snowflake and NoSQL (Cosmos DB preferred); and deploying and managing applications in Azure Cloud with Docker and Kubernetes (AKS). Furthermore, you will integrate with enterprise messaging systems like Kafka, troubleshoot and optimise application performance, and collaborate with DevOps to maintain CI/CD pipelines (Azure DevOps preferred). Following Agile development practices and participating in daily scrums, sprint planning, and code reviews are essential parts of this position.

The ideal candidate should possess at least 8 years of software development experience in Java, with expertise in core Java (Java 8+), J2EE, Spring Boot, Hibernate/JPA, REST APIs, JSON, microservices architecture, Spring frameworks, ETL frameworks, SQL (MS SQL, PostgreSQL), Snowflake, NoSQL databases (preferably Cosmos DB), Azure Cloud, Docker, Kubernetes (AKS), enterprise messaging systems (Kafka), and CI/CD setup and troubleshooting (preferably Azure DevOps). Exceptional leadership and communication skills, strong problem-solving and analytical thinking, and the ability to manage multiple priorities in a fast-paced environment are highly valued, as are adaptability, flexibility to change, and occasional willingness to work outside normal hours.

Preferred skills for this role include knowledge of UI technologies (React JS, JavaScript, HTML5, CSS3), experience in the financial/banking domain, familiarity with Maven, Gradle, and Git, database performance tuning, Infrastructure-as-Code (Terraform), Control-M, Dynatrace, ServiceNow, and strong Unix command and scripting skills.

Joining Keyrus means becoming part of a market leader in the Data Intelligence field and a prominent player in Management Consultancy and Digital Experience. You will have the opportunity to work in a dynamic environment with an established international network of professionals committed to bridging the gap between innovation and business. Keyrus offers a platform to showcase your talents, build experience through client projects, and grow based on your capabilities and interests in a supportive and dynamic atmosphere.
Additionally, Keyrus UK provides various benefits including a competitive holiday allowance, a comprehensive Private Medical Plan, flexible working patterns, a Workplace Pension Scheme, Sodexo Lifestyle Benefits, a Discretionary Bonus Scheme, a Referral Bonus Scheme, and Training & Development via KLX (Keyrus Learning Experience).
Posted 1 month ago
10.0 - 15.0 years
0 Lacs
karnataka
On-site
As a seasoned professional in the field of data analytics, you will play a crucial role in building trust with senior stakeholders by providing strategic insights and demonstrating delivery credibility. Your expertise will be instrumental in translating complex client business problems into effective Business Intelligence solutions and implementing them successfully. You will oversee multiple BI and analytics programs for various clients, ensuring timely delivery and collaborating with Data Engineering teams to achieve common goals.

Your responsibilities will include ensuring the scalability and quality of deliverables that have a direct impact on business outcomes. Additionally, you will be involved in recruiting and onboarding team members, as well as directly managing a team of 15-20 individuals. It will be essential for you to take ownership of customer deliverables and work closely with project managers to align project schedules with customer expectations.

With over 15 years of experience, including a minimum of 10 years in data analytics execution, you possess strong organizational and multitasking skills. You are able to engage effectively with clients from both business and technical backgrounds and to understand their workflows, reporting requirements, and decision-making processes. Your analytical skills enable you to analyze and present data effectively, driving actionable insights that improve Key Performance Indicators (KPIs).

Your proficiency in BI tools such as Power BI, Tableau, and Qlik, as well as backend systems such as SQL and ETL frameworks, will be critical to your success in this role. Experience with cloud-native platforms such as Snowflake, Databricks, Azure, and AWS, along with data lakes, is highly valued. Knowledge of compliance, access controls, and data quality frameworks is a plus, as is experience in the CPG, Supply Chain, Manufacturing, and Marketing domains.

You will need to leverage your problem-solving skills to prioritize conflicting requirements and communicate effectively with senior management, other departments, and external partners. Your excellent written and verbal communication skills will be key to summarizing key findings succinctly and driving decision-making based on clear insights.
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
noida, uttar pradesh
On-site
You will lead complex projects from inception to completion, showcasing your expertise in business and functional areas. As a subject matter expert, you will advise cross-functional teams and stakeholders on best practices and innovative solutions. Your responsibilities will include conducting in-depth analysis of complex datasets to identify trends, patterns, and optimization opportunities, and driving continuous improvement initiatives to enhance processes and workflows.

Collaborating closely with leadership to establish and execute strategic objectives aligned with organizational goals is a key aspect of the role. You will conduct comprehensive research and analysis to recognize emerging trends and opportunities within the industry, deliver high-quality work within specified timelines while adhering to company standards and policies, and build strong relationships with clients, partners, and vendors to foster collaboration and achieve mutually beneficial outcomes. Staying updated on industry advancements and engaging in professional development activities to maintain your expertise is expected.

You should possess a Bachelor's or Master's degree in CS/IT from a reputed college and have 12-16 years of experience in data and business analysis. Technical proficiency in data models, database design and development, data mining, and segmentation techniques is required, along with hands-on experience with SQL, reporting packages (e.g., Business Objects), programming (XML, JavaScript, ETL frameworks), and strong number-crunching skills.

Moreover, you should have a strong understanding of statistics and proficiency with statistical packages such as Excel, SPSS, and SAS. Exceptional analytical skills and the ability to collect, organize, analyze, and present information accurately and with attention to detail are necessary, as is being adept at queries, report writing, and presenting findings. Excellent written and verbal communication skills are a must.

This position is located in Sector 125, Noida, and requires onsite work.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You should have at least 10 years of overall development experience, with at least 5 years in a Data Engineering role. Your responsibilities will include building and optimizing big data pipelines, architectures, and data sets. You should have a strong background in writing SQL statements and experience with the Spring/Spring Boot framework, as well as experience with relational databases such as Postgres and Oracle and cloud databases such as Snowflake and BigQuery.

Experience implementing web services such as SOAP and RESTful web services is required, and knowledge of frontend frameworks and libraries such as Angular, jQuery, and Bootstrap is also expected. You should be familiar with real-time and batch data processing, ETL frameworks such as the Google Cloud Data platform or Apache Beam, and analyzing data to derive insights. Leading small to midsize technical teams, customer-facing experience, and managing deliverables are also part of the role.

Good verbal and written communication skills are essential, along with an advanced understanding of modern software development and software testing methodologies, scripting, and tools. You should have at least three full SDLC experiences for web application projects. Experience in Agile development environments and with messaging platforms like ActiveMQ would be a plus.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an AWS Consultant specializing in Infrastructure, Data & AI, and Databricks, you will play a crucial role in designing, implementing, and optimizing AWS infrastructure solutions. Your expertise will be utilized to deliver secure and scalable data solutions using various AWS services and platforms. Your responsibilities will also include architecting and implementing ETL/ELT pipelines, data lakes, and distributed compute frameworks. You will work on automation and infrastructure as code using tools like CloudFormation or Terraform, and manage deployments through AWS CodePipeline, GitHub Actions, or Jenkins.

Collaboration with internal teams and clients to gather requirements, assess current-state environments, and define cloud transformation strategies will be a key aspect of your role. Your support during pre-sales and delivery cycles will involve contributing to RFPs, SOWs, LOEs, solution blueprints, and technical documentation. Ensuring best practices in cloud security, cost governance, and compliance will be a priority.

The ideal candidate will possess 3 to 5 years of hands-on experience with AWS services, a Bachelor's degree or equivalent experience, and a strong understanding of cloud networking, IAM, security best practices, and hybrid connectivity. Proficiency in Databricks on AWS and experience with data modeling, ETL frameworks, and structured/unstructured data are required skills. Additionally, you should have working knowledge of DevOps tools and processes in the AWS ecosystem, strong documentation skills, and excellent communication abilities to translate business needs into technical solutions.

Preferred certifications for this role include AWS Certified Solutions Architect - Associate or Professional, AWS Certified Data Analytics - Specialty, and Databricks Certified Data Engineer Associate/Professional (a plus).
Posted 1 month ago
10.0 - 12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Tarento:
Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Must-Have Skills:
- 10-15 years of experience in data engineering/architecture, including leadership roles
- Expertise in PySpark, AWS Glue, and designing scalable cloud-native systems
- Proven experience architecting large-scale batch and real-time data platforms
- Advanced design proficiency in S3-based data lakes and related optimization techniques
- Practical knowledge of AWS services: Lambda, Athena, IAM, CloudWatch, Step Functions, IoT FleetWise, IoT Core, MSF, MSK
- Strong technical leadership and ability to mentor teams
- Demonstrated domain experience in connected vehicles, IoT/telemetry, or streaming platforms

Good-to-Have Skills:
- Familiarity with OpenSearch or Elasticsearch
- Experience with Snowflake and real-time data technologies (Kafka, Flink, Kinesis)
- Understanding of distributed system designs

Key Responsibilities:
- Lead strategy and design for data platforms in connected vehicle and digital mobility products
- Translate business requirements into detailed technical architecture, including integration, security, and data flow
- Design and manage robust ETL frameworks using cloud-native tools
- Architect systems to handle high-throughput telemetry: over 1 billion records per minute
- Make decisions regarding data formats, partitioning, metadata, schema evolution, and monitoring
- Apply knowledge of CI/CD, Infrastructure as Code (IaC), and optimal cloud provisioning
- Facilitate cross-team alignment for seamless cloud-to-device telemetry
- Partner with executives and R&D to influence long-term data solutions

Who You Are:
- A visionary architect focused on future-proof, scalable data systems
- A strategic thinker with strong communication skills for both technical and executive audiences
- A change agent comfortable in fast-evolving environments

Eligibility:
- Education: BTech/MTech in Computer Science, Data Engineering, or a related field
- Experience: Minimum 10 years in architecture, data engineering, or cloud platform leadership

Why Join:
- Lead innovation in next-gen mobility for millions globally
- Work with advanced technologies in high-volume data environments
- Flourish in a growth-focused, flexible, and learning-driven culture
Posted 1 month ago
10.0 - 17.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Sigmoid: Sigmoid empowers enterprises to make smarter, data-driven decisions by blending advanced data engineering with AI consulting. We collaborate with some of the world's leading data-rich organizations across sectors such as CPG-retail, BFSI, life sciences, manufacturing, and more to solve complex business challenges. Our global team specializes in cloud data modernization, predictive analytics, generative AI, and DataOps, supported by 10+ delivery centers and innovation hubs, including a major global presence in Bengaluru and operations across the USA, Canada, UK, Netherlands, Poland, Singapore, and India. Recognized as a leader in the data and analytics space, Sigmoid is backed by Peak XV Partners and has consistently received accolades for innovation and rapid growth. Highlights include being named a Leader in ISG's Specialty Analytics Services for Supply Chain (2024), a two-time India Future Unicorn by Hurun India, and a four-time honoree on both the Inc. 500 and Deloitte Technology Fast 500 lists.

Director - Data Analytics: This role is a leadership position in the data science group at Sigmoid. The ideal person will come from a services-industry background with a good mix of experience in solving complex business intelligence and data analytics problems, team management, delivery management, and customer handling. This position will give you an immense opportunity to work on challenging business problems faced by Fortune 500 companies across the globe. The role is part of the leadership team and includes accountability for a part of her/his team and customers. The person is expected to contribute to developing the practice, with relevant experience in the domain, nurturing the talent in the team, and working with customers to grow accounts.

Responsibilities include:
- Build trust with senior stakeholders through strategic insight and delivery credibility
- Translate ambiguous client business problems into BI solutions and implement them
- Oversee multi-client BI and analytics programs with competing priorities and timelines, while collaborating with Data Engineering and other functions on a common goal
- Ensure scalable, high-quality deliverables aligned with business impact
- Help recruit and onboard team members; directly manage 15-20 team members
- Own customer deliverables and ensure, along with project managers, that project schedules are in line with the expectations set with customers

Experience and Qualifications:
- 15+ years of overall experience with a minimum of 10+ years in data analytics execution
- Strong organizational and multitasking skills with the ability to balance multiple priorities
- Highly analytical, with the ability to collate, analyze, and present data and drive clear insights that lead to decisions that improve KPIs
- Ability to effectively communicate and manage relationships with senior management, other departments, and partners
- Mastery of BI tools (Power BI, Tableau, Qlik), backend systems (SQL, ETL frameworks), and data modeling
- Experience with cloud-native platforms (Snowflake, Databricks, Azure, AWS) and data lakes
- Expertise in managing compliance, access controls, and data quality frameworks is a plus
- Experience working in the CPG, Supply Chain, Manufacturing, and Marketing domains is a plus
- Strong problem-solving skills and ability to prioritize conflicting requirements
- Excellent written and verbal communication skills and the ability to succinctly summarize key findings
Posted 1 month ago