
1683 RDBMS Jobs - Page 50

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

1 - 4 years

3 - 7 Lacs

Gurgaon

Work from Office

Source: Naukri

You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you, with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

What we're looking for: You're a talented, creative, and motivated engineer who loves developing powerful, stable, and intuitive apps, and you're excited to work with a team of individuals with that same passion. You've accumulated years of experience, and you're excited about taking your mastery of Cloud, Big Data and Java to a new level. You enjoy challenging projects involving big data sets and are cool under pressure. You're no stranger to fast-paced environments and agile development methodologies; in fact, you embrace them. With your strong analytical skills, your unwavering commitment to quality, your excellent technical skills, and your collaborative work ethic, you'll do great things here at American Express.

Purpose of the Role: American Express is on a path to migrate big data workloads from on-prem to a cloud-hosted solution. As an Engineer you will deliver technical solutions using cutting-edge technologies and industry best practices to implement cloud-native tools in a large-scale migration. You should have experience in architecting, deploying and automating cloud-based data platforms, and possess skills in technologies such as GCP, Big Data, Java, Scala, Spark, Kafka, APIs, Python, RDBMS and NoSQL. You should also have experience in building and using low-code/no-code data transformation tools for setting up complex rules.

Responsibilities:
* As a Big Data Engineer, you'll be responsible for designing and building high-performance, scalable data platforms
* You should be a quick learner, able to understand user needs and guide users towards the most suitable path
* You will collaborate effectively with product teams from the business group to understand user needs, requirements, and the roadmap and vision of different tooling
* You will focus on creating automated tools that cater to multiple use cases and speed up the enablement process
* You will work with a variety of teams and individuals, including platform engineers, use-case owners and analytical users, to understand their needs and come up with innovative solutions
* You will follow the Amex way of building engineering products, driving engineering excellence by adopting DevOps principles

Qualifications:
* Bachelor's degree in Computer Science, Engineering, or a related field; a Master's degree is a plus
* Great to have: GCP professional certification (Data Engineer/Cloud Architect) is preferable
* 6+ years of software development experience with hands-on expertise in coding in Java/Python/Scala etc.
* 4+ years of experience in setting up ETL jobs on a low-code UI
* 2+ years of experience using GCP cloud tools such as BigQuery, Dataproc, Dataflow, Cloud SQL and Pub/Sub
* Strong SQL and RDBMS skills; expert in writing complex SQL for different databases such as Hive, MySQL and Postgres, with proficiency in NoSQL databases as well
* Experience working with Spark, Big Data and Hive
* Experience in Git management, including PR reviews and maintaining code hygiene
* In-depth understanding of data warehousing concepts, dimensional modelling, and data integration techniques
* Experience in optimizing high-volume data processing jobs
* Hands-on experience with CI/CD pipelines, automated test frameworks, DevOps and source code management is a big plus (XLR, Jenkins, Git, Stash, Jira, Confluence, Splunk etc.)
* Experience working in an Agile/SAFe framework for development
* Excellent communication and analytical skills
* Excellent team player with the ability to work with a global team

Benefits include: competitive base salaries; bonus incentives; support for financial well-being and retirement; comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location); flexible working model with hybrid, onsite or virtual arrangements depending on role and business need; generous paid parental leave policies (depending on your location); free access to global on-site wellness centers staffed with nurses and doctors (depending on location); free and confidential counseling support through our Healthy Minds program; and career development and training opportunities.
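
As a purely illustrative sketch (not part of the listing above), the kind of "complex SQL" such roles reference, for example a windowed aggregation, can be prototyped against any RDBMS. The snippet below uses Python's built-in sqlite3 module; the table and column names are hypothetical.

```python
import sqlite3

# Minimal sketch: a windowed aggregation of the kind referenced above.
# The table and columns (orders, customer_id, amount) are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('C1', '2024-01-05', 120.0),
        ('C1', '2024-02-10',  80.0),
        ('C2', '2024-01-20', 200.0);
""")

# Running total per customer, ordered by date -- a typical analytical query that
# translates with minor dialect changes to Hive, MySQL or Postgres.
query = """
    SELECT customer_id,
           order_date,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer_id ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer_id, order_date;
"""
for row in conn.execute(query):
    print(row)
```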

Posted 3 months ago

Apply

3 - 8 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

You Lead the Way. We've Got Your Back. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

Our Global Commercial Services team is seeking a driven, well-rounded computer science professional to join our growing team. You must be a self-starter, eager to learn new technology trends, and able to partner with the business teams to ensure technology changes do not impact our customers. If you are an engineer with a passion for operating things and the desire to understand the full stack (business down to infrastructure), and you embrace the instinct to code problems away, then this dynamic role is for you!

Roles & Responsibilities: Collaborate with the product team and technical experts to identify optimal solutions for business requirements. Contribute to analysis and design of software components, and develop software with the highest quality. Understand and adopt best practices and enterprise guidelines during software development. Identify technical debt in the platform and work with leads and the product owner to triage it based on criticality. Provide post-implementation production support for applications/services. Guide junior team members during software development. Contribute to CI/CD initiatives like test automation, alerting and monitoring strategy. Demonstrate leadership, excellent communication and collaboration skills, and a willingness to learn and experiment with new technologies and tools.

Minimum Qualifications: Bachelor's degree in a related field preferred; proven industry experience. 3-8 years of full stack software development with good hands-on experience across all tiers of the application. Experience in building and hosting applications on the cloud is a plus. Experience in developing software applications using agile methodologies. Experience in supporting production issues post implementation. Experience in .NET programming/development is a plus.

Knowledge/Skills: Full stack software development using Java 17 or higher with the Spring framework, Spring Boot, REST APIs, microservices architecture, RDBMS (Postgres/Oracle or similar), and UI design and development using Angular or React. Good understanding of C#, .NET Core, and Entity Framework. Experience in delivering software using DevOps practices like CI, CD, automated testing, alerting and monitoring. Good understanding of application security principles and remediating vulnerabilities. Understanding of multi-tier application architectures and related development. Proven experience in creating automated tests using test frameworks. Knowledge and understanding of SDLC principles. Understanding of BDD and TDD practices.
Benefits include: competitive base salaries; bonus incentives; support for financial well-being and retirement; comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location); flexible working model with hybrid, onsite or virtual arrangements depending on role and business need; generous paid parental leave policies (depending on your location); free access to global on-site wellness centers staffed with nurses and doctors (depending on location); free and confidential counseling support through our Healthy Minds program; and career development and training opportunities.

Posted 3 months ago

Apply

7 - 10 years

17 - 18 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Description: Role Title: Product Engineer, ACM (L09)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work For by Great Place to Work. We were among the Top 50 of India's Best Workplaces in Building a Culture of Innovation for All by GPTW and in the Top 25 Best Workplaces in BFSI by GPTW. We have also been recognized by the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and among Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview: Synchrony's Engineering Team is a dynamic and innovative team dedicated to driving technological excellence. As a member of this team, you'll play a pivotal role in designing and developing a cutting-edge tech stack and solutions that redefine industry standards. Consider the credit card we use every day to purchase our essentials and later settle the bills - a simple process we are all used to on a day-to-day basis. Now consider the vast complexity hidden behind this seemingly simple process, operating tirelessly for millions of cardholders. The sheer volume of data processed is mind-boggling. Fortunately, advanced technology stands ready to automate and manage this constant torrent of information, ensuring smooth transactions around the clock, 365 days a year. Our collaborative environment encourages creative problem-solving and fosters career growth. Join us to work on diverse projects, from fintech to data analytics, and contribute to shaping the future of technology. If you're passionate about engineering and innovation, Synchrony's Engineering Team is the place to be.

Role Summary/Purpose: Billions of transactions - and you'll touch all of them if you join our IT team as a Product Engineer, ACM. Imagine the sheer scale of what we impact every second of every day. Now imagine what you can do with that influence. This is where you can shape the future of servicing our customers. As a Product Engineer, ACM, you'll be building microservices, MFEs and APIs, and managing an amazing team of engineers working on our applications leveraging cloud technologies. It's the ideal time to come aboard - we're focused on the future, continuing to evolve as a company and help define the financial technology industry. With so much opportunity available, this is where you can make your mark.

Key Responsibilities: Work as a Product Engineer with expertise in Advanced Case Management with IBM Case Manager and the FileNet P8 platform. Work with business and IT stakeholders to understand business needs in the area of case management and provide solutions; provide technical leadership to the development team. Work with business analysts, developers, project managers, and users to capture requirements, provide solution design and govern implementation.

Required Skills/Knowledge: Bachelor's degree in any engineering discipline or MCA with 2+ years of working experience on IBM Case Manager, or in lieu of a degree, 4+ years of working experience on IBM Case Manager. Must have experience working on IBM Case Manager 5.1/5.2/5.3. Experience in case management and Business Process Management solution design and development using IBM Case Manager and the IBM FileNet P8 stack. Experience with customization of IBM Case Manager solutions, development of widgets, External Data Services, and the Case Manager API. Experience in developing RESTful services using Spring Boot. Excellent oral and written communication. Flexibility to work across time zones if needed.

Desired Skills/Knowledge: Familiarity with designing applications using SOLID principles, Java and microservice design patterns, with business acumen. Working knowledge of RDBMS. Ability to analyze, use structured problem solving and available tools to troubleshoot systems, and identify root cause, action plans, impact and resolution options.

Eligibility Criteria: Bachelor's degree in any engineering discipline or MCA with 2+ years of working experience on IBM Case Manager, or in lieu of a degree, 4+ years of working experience on IBM Case Manager.

Work Timings: 2 PM - 11 PM IST. (This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time; timings are anchored to US Eastern hours and will adjust twice a year locally. This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.)

For Internal Applicants: Understand the criteria and mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). You must not be on any corrective action plan (First Formal/Final Formal, PIP). L4 to L7 employees who have completed 12 months in the organization and 12 months in the current role and level are eligible. L8+ employees who have completed 18 months in the organization and 12 months in the current role and level are eligible. L04+ employees can apply.

Grade/Level: 09
Job Family Group: Information Technology

Posted 3 months ago

Apply

7 - 10 years

15 - 16 Lacs

Chennai

Work from Office

Source: Naukri

Job Summary: The dx data engineering team is involved in data analytics, ELT and load semantics of in-house self-service products, and certifies the product according to business needs and standards. The dx Data Engineer needs to work on developing and testing applications on backend systems/platforms.

Job Description - Core Responsibilities: Expert level in DWH, ETL, RDBMS, SQL, Hadoop/Big Data, Spark and data modelling. Expert level in the Big Data ecosystem and Cloud (AWS) technologies, plus DevOps. Nice to have experience in Docker/Kubernetes/MinIO. Experience in handling enterprise data lake/data warehouse solutions. Responsible for taking accountability in delivering solutions based on data lineages. Supervise the team for best-in-class implementations. Responsible for planning, design, and implementation. Agile in critical problem-solving and solution orientation. Develop end-to-end business applications (backend). Works hand-in-hand with the core team. Good appetite for continuous improvement and innovation. Experience with agile methodology. Success in stakeholder management. Collaborate with product development teams. Consistent exercise of independent judgment and discretion in matters of significance. Proactive participation and effective communication in team discussions. Other duties and responsibilities as assigned.

Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Skills: Apache Spark, AWS Cloud Computing, Big Data, Databricks Platform, Data Warehousing (DW), ETL Development, Innovation, Structured Query Language (SQL), Teradata Database

We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Certifications (if applicable). Relative Work Experience: 7-10 Years.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
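
As an illustrative sketch only (not part of the listing), a batch ELT step of the kind these data engineering roles describe might look like the following PySpark job. The file paths, column names and the presence of a local Spark installation are all assumptions.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch of a batch ELT step: read raw events, aggregate, write curated output.
# "events.csv", its columns, and the output path are hypothetical.
spark = SparkSession.builder.appName("daily-usage-rollup").getOrCreate()

events = spark.read.csv("events.csv", header=True, inferSchema=True)

daily_usage = (
    events
    .withColumn("event_date", F.to_date("event_ts"))   # normalize timestamp to a date
    .groupBy("account_id", "event_date")
    .agg(F.count("*").alias("event_count"),
         F.sum("bytes_used").alias("total_bytes"))
)

# Partitioned Parquet is a common layout for a data lake / warehouse landing zone.
daily_usage.write.mode("overwrite").partitionBy("event_date").parquet("curated/daily_usage")

spark.stop()
```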

Posted 3 months ago

Apply

7 - 10 years

15 - 16 Lacs

Chennai

Work from Office

Source: Naukri

Job Summary: The dx data engineering team is involved in data analytics, ELT and load semantics of in-house self-service products, and certifies the product according to business needs and standards. This role is responsible for planning and designing new and existing applications, assists with tracking performance metrics, integrates knowledge of business and functional priorities, acts as a key contributor in a complex and crucial environment, and leads teams or projects while sharing expertise.

Job Description - Core Responsibilities: Expert level in DWH, ETL, RDBMS, SQL, Hadoop/Big Data, Spark and data modelling. Expert level in the Big Data ecosystem and Cloud (AWS) technologies, plus DevOps. Nice to have experience in Docker/Kubernetes/MinIO. Experience in handling enterprise data lake/data warehouse solutions. Responsible for taking accountability in delivering solutions based on data lineages. Supervise the team for best-in-class implementations. Responsible for planning, design, and implementation. Agile in critical problem-solving and solution orientation. Develop end-to-end business applications (backend). Works hand-in-hand with the core team. Good appetite for continuous improvement and innovation. Experience with agile methodology. Success in stakeholder management. Collaborate with product development teams. Consistent exercise of independent judgment and discretion in matters of significance. Proactive participation and effective communication in team discussions. Other duties and responsibilities as assigned.

Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Skills: Agile Methodology, Apache Spark, AWS Cloud Computing, Big Data, Databricks Platform, Data Warehousing (DW), ETL Development, Python (Programming Language), Structured Query Language (SQL), Teradata Database

We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Certifications (if applicable). Relative Work Experience: 7-10 Years.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Posted 3 months ago

Apply

12 - 14 years

20 - 22 Lacs

Hyderabad

Work from Office

Source: Naukri

Database Engineer, KBS Support Ops

We are hiring a SQL Server engineer and architect to join our team. You will be responsible for creating innovative software solutions using the existing SQL Server tech stack and related technologies. Our ideal candidate has a deep understanding of database technologies and is familiar with various design and architectural patterns for data storage techniques. As a SQL Server engineer and DBA, you will not only design and develop the backend of cutting-edge web applications using the latest technologies, but also lead our application stack to the next level of database technologies, leveraging Azure and AWS services. You will be responsible for overseeing quality standards to ensure best practices are being used in development across all persistence layers, and for collaborating with the offshore development team. You will also communicate effectively with the team and provide technical guidance and support as needed.

The day-to-day responsibilities include, but are not limited to, the following. The Database Administrator will be assigned to work with one or more development project teams and will be expected to: Act as a technical subject matter expert for all persistence layers used in development, with specific skills in Microsoft SQL Server and T-SQL. Identify the appropriate persistence layer(s) for new development, considering availability, performance, size and security. Work with Scrum teams in an Agile development environment. Working with the Solutions Designer, Development Architect and third parties as appropriate, ensure that a technical design document describing the technical architecture for the program/project is produced. Maintain data models, schema diagrams and data dictionaries for business-critical data systems. Understand the business and application requirements and translate them into a database design. Ensure that all technology selected for the program conforms to the organization's long-term technical strategy and compliance requirements. Be responsible for the availability, stability, recoverability and performance of the database environment by ensuring all systems are monitored, patched, and operating optimally. Design and develop complex SQL queries and appropriate database code such as stored procedures, triggers and functions to support complex business processes. Identify, troubleshoot and address poorly performing queries, database waits and deadlocks in production systems. Develop and deliver extensible, sustainable, quality database solutions based on defined business needs. Review and update stored procedures for optimal performance. Identify and correct performance bottlenecks related to SQL code. Recommend changes to the relational data model to solve performance issues with SQL code. Monitor and troubleshoot SQL jobs and processes using available tools such as SQL Profiler and Redgate; review logs and triage issues. Participate in the resolution of production data issues. Perform code reviews, providing feedback in a timely manner, and incorporate feedback from code reviews as appropriate. Participate in the review of all database schema projects; provide feedback and identify impacts to the existing environment. Participate in the release of all stored procedure projects, including release documentation. Maintain knowledge as new versions of SQL Server are released and leverage new functionality as appropriate. Proactively identify and act on opportunities to improve database systems and processes. Contribute to standards documentation for database coding practices. Prioritize multiple tasks so that aggressive deadlines are met. Provide after-hours support as required to address system outages and perform system updates, upgrades and modifications. Perform other duties as required.

Essential traits: B.S. degree in Information Technology, Computer Science or a related discipline is preferred. MCSE, AWS Certified. 8+ years of experience in RDBMS systems, with at least 5 years of database development and support in MS SQL Server environments. 3+ years of working experience in deploying physical and cloud instances of SQL Server (AWS preferred) and scripting languages like PowerShell, VBScript, Python, or similar. Experience designing data storage solutions outside of the RDBMS stack, including non-relational DB engines and techniques. Understanding of JSON and XML data storage fields and extraction. Bulk upload techniques and optimizations. Power BI or similar tools. Experience in troubleshooting and resolving database integrity issues, performance issues, blocking and deadlocking issues, replication/log shipping issues, network/connectivity issues, security issues, etc. Experience tuning queries and using optimization tools like query analyzer, SQL Monitor, Redgate, New Relic or other APM, and other related tools. Ability to detect and troubleshoot SQL Server-related hardware resource contention such as CPU, memory, disk IO, etc. Solid acquaintance with Windows Server, security models, and storage components such as SAN. Knowledge of High Availability (HA) and Disaster Recovery (DR) options for SQL Server or similar AWS DB platforms. Advanced experience with the T-SQL language and stored procedures, transactional replication, multi-node clustering, mirroring, etc. A broad understanding of enterprise-class technologies. Extensive knowledge and experience with large-scale client-facing online systems, Amazon AWS technologies and Windows-based technologies. Ability to communicate and interact effectively with co-workers and customers. Ability to work and adapt in a dynamic environment and recognize priority issues, escalating accordingly. Experience with creating standards and documentation. Ability to work both independently and as part of a project group, with time constraints. Strong customer service orientation, organizational and communication skills, and a positive, can-do attitude.

About Kroll: In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity - not just answers - in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One team, One Kroll, you'll contribute to a supportive and collaborative work environment that empowers you to excel. Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight and independence to help our clients make sound decisions. As an organization, we think globally and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit.
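
Purely as an illustrative sketch (not part of the listing), the tuning loop described above, inspecting a query plan, adding an index, and confirming the plan changes, can be demonstrated in a few lines. Here Python's sqlite3 stands in for SQL Server, and the table, column and index names are hypothetical.

```python
import sqlite3

# Sketch of the tuning loop referenced above: show the plan, add an index, show it again.
# sqlite3 stands in for SQL Server; "orders", "customer_id" and the index name are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id TEXT, status TEXT)")
conn.executemany("INSERT INTO orders (customer_id, status) VALUES (?, ?)",
                 [(f"C{i % 100}", "OPEN") for i in range(10_000)])

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ?"

def show_plan(label):
    plan = conn.execute("EXPLAIN QUERY PLAN " + query, ("C42",)).fetchall()
    print(label, [row[-1] for row in plan])    # last column holds the plan detail text

show_plan("before index:")                      # expect a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
show_plan("after index:")                       # expect a search using the new index
```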

Posted 3 months ago

Apply

5 - 10 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

At RemoteStar, we're currently hiring for one of our clients.

About the Client: A FTSE 250 global fintech company headquartered in London with a presence in 18 countries and five continents. Their award-winning products and platforms empower go-getters around the world, giving them access to over 19,000 financial markets.

Job title: Sr. Data Engineer - Data Modeler

What you'll do: The broad scope of responsibility of this role requires solid experience and understanding of data models, data analysis, data management, and programming languages to develop, analyse, and maintain data models. This role involves working closely with data architects, business analysts, and data consumers to ensure that data models meet business requirements and are aligned with industry best practices.

You will:
- Create and implement logical and physical data models that support structured and unstructured data across RDBMS and big data platforms.
- Design and implement semantic data models that optimise data accessibility, performance, and usability for Business Intelligence (BI) and analytics.
- Create and maintain documentation of data models, including entity-relationship diagrams (ERDs), data flow diagrams (DFDs), and data dictionaries, to ensure clarity and alignment across teams.
- Develop and maintain metadata repositories and data lineage documentation to track data transformations and dependencies.
- Work closely with business analysts, data engineers, and data consumers to gather requirements and translate business needs into data models.
- Own the model governance for any changes to the model from users.
- Partner with data architects to ensure data models conform to the data strategy and best practices in data management and integration. Work with architecture teams to expand the existing models.
- Conduct regular data model reviews to incorporate changes in data platforms, evolving business requirements, and performance optimisations.
- Ensure all data models comply with data quality, security, and governance standards.
- Analyse data lineage from source systems to the final semantic layer, identifying transformation rules and their impact on the physical data model.
- Create data models that support migrations, conversions, and transformations as data platforms evolve and new standards are defined.

What you'll need for this role:
- 7-10 years of experience as a Data Modeler or in a similar role, with expertise in data modelling principles (conceptual, logical, and physical).
- 5+ years of hands-on experience in data modelling, databases, and data analysis.
- Strong understanding of data modelling concepts, including normalization, denormalization, dimensional modelling, and the Data Vault 2.0 methodology (certification or hands-on experience preferred).
- Proficiency in data modelling tools such as ER/Studio, ERwin, SQLDBMS, or Sparx Systems.
- Hands-on experience with relational databases (RDBMS) such as PostgreSQL, Oracle, SQL Server and MySQL, and with data warehousing solutions.
- Knowledge of data integration, ETL/ELT workflows, and data migration.
- Experience in semantic data modelling, including Business Intelligence layer data modelling, and familiarity with data visualization tools.
- Strong understanding of database management systems (DBMS), SQL, and query optimisation.
- Familiarity with programming languages like SQL and Python, and a strong understanding of data warehousing and ETL techniques.
- Experience in metadata management, data privacy standards, and compliance.
- Expert knowledge of cloud-based data platforms (AWS, Azure, or GCP) and data lakehouse architectures (Medallion architecture, Delta Lake, Snowflake, or Databricks).
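
As an illustrative sketch only (unrelated to the client's actual schemas), the dimensional modelling mentioned above often reduces to a small star schema: one fact table keyed to dimension tables. The DDL below, run through Python's sqlite3 for self-containment, uses entirely hypothetical table and column names.

```python
import sqlite3

# Tiny star-schema sketch: one fact table referencing two dimensions.
# All table and column names are hypothetical.
ddl = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,      -- natural/business key from the source system
    segment      TEXT
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240105
    full_date TEXT NOT NULL
);

CREATE TABLE fact_trade (
    trade_id     INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    notional     REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print([row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```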

Posted 3 months ago

Apply

8 - 12 years

11 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

R360 powers a data-driven customer engagement ecosystem that enables the world's leading organisations to earn customer loyalty every day. For over 10 years, R360's loyalty and reward platforms have driven ambitious loyalty programs for some of the world's biggest brands, including Axis Bank, Standard Chartered Bank, HDFC Bank and Emirates National Dubai Bank. We have 500+ global retail partnerships and offer our clients reward programs, nuanced customer segmentation, data-centric campaigns, and big data analytics. Our multidisciplinary team of technology experts, product engineers, data scientists, client relationship managers and customer experience professionals work together with our clients to enable loyalty by enhancing customer acquisition, engagement and retention. Our values - customer-centricity, integrity, excellence, empowerment, inclusivity, accountability and continuous learning - aren't just words; they are our compass.

What you'll do: Design and develop a large and scalable API hub/gateway to host different types of microservices. Lead a team of developers who will build the API hub and the hosted microservices. Work with different platform teams to facilitate usage and integration of the APIs from the API hub. Collaborate with partners (large digital ecosystems) to understand requirements and design and develop different microservices. Work closely with the infrastructure and security teams to define a secure and scalable infrastructure. Hire and nurture a strong and smart team of developers.

What we'll need: 7+ years of experience in building complex, highly scalable, high-volume, low-latency enterprise applications using Java. 7+ years in both backend and front-end technologies, primarily Java and Angular. Strong experience in building microservices using technologies like Spring Boot and Spring Cloud. Deep understanding of microservice design patterns, service registry and discovery, and externalization of configuration. Experience in message streaming and processing technologies such as Kafka, Spark, Storm etc. Experience with one or more reactive microservices tools and techniques such as Akka, Vert.x, ReactiveX. Strong experience in the creation, management and consumption of REST APIs leveraging Swagger, Postman and API gateways (such as MuleSoft, Apigee). Good knowledge of data modelling, querying and performance tuning of any big-data stores (MongoDB, Elasticsearch, Redis etc.) and/or any RDBMS (Oracle, PostgreSQL, MySQL etc.). Experience working with Agile/Scrum-based teams that utilize Continuous Integration/Continuous Delivery processes using Git, Maven, Jenkins etc. Experience in container-based (Docker/Kubernetes) deployment and management. Experience in using AWS/GCP/Azure-based cloud infrastructure. Knowledge of security frameworks, concepts and technologies like Spring Security, OAuth2, SAML, SSO, and Identity and Access Management.

What's in it for you: You get to have a high-ownership role in a company that is bootstrapped, profitable and in its scale-up journey. You get to mentor junior engineers in the team while you accelerate your career!

Posted 3 months ago

Apply

4 - 6 years

6 - 8 Lacs

Gurgaon

Work from Office

Source: Naukri

About the Role: Job Title: Sustenance and Transformation Engineer (Oracle)

Your Role: As an Oracle expert with Capgemini, you will provide SQL and PL/SQL coding expertise, which includes writing functions, procedures, and packages, and reviewing, analysing and testing code for database performance enhancement. In this role you will play a key part in: leveraging your extensive experience in providing ongoing support and maintenance of applications using Oracle SQL, PL/SQL and Oracle Forms; conducting code design activities for larger projects using Oracle SQL, PL/SQL and Oracle Forms; utilizing your expertise in Oracle performance tuning and optimisation; and participating in problem investigation, redevelopment, testing and deployment.

Skills: Oracle PL/SQL. Very good understanding of DBMS and RDBMS. Experience in data modeling for OLTP and OLAP. Hands-on experience in PL/SQL development and writing complex queries. Creating database objects: tables, materialized views, jobs. Work on PL/SQL database objects, SQL scripts, interfaces, reports and forms using standard tools including Oracle Forms, Report Builder, Oracle Application Framework and BI Publisher. Experience in design, development, support and maintenance.

Posted 3 months ago

Apply

5 - 8 years

20 - 22 Lacs

Pune

Work from Office

Source: Naukri

At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world's financial system, we touch nearly 20% of the world's investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what #LifeAtBNY is all about.

We're seeking a future team member for the role of Senior Associate, Fullstack Developer to join our Markets Engineering team. This role is in Pune.

In this role, you'll make an impact in the following ways: Develop software artifacts using Agile methodology and extend ownership by providing production support. Design, implement and maintain Java-based applications that can be high-volume and low-latency. Maintain software functionality. Engage with business analysts to understand platform requirements. Actively participate in code reviews. Integrate software components into a fully functional software system. Apply security and privacy design principles.

To be successful in this role, we're seeking the following: Bachelor's degree in computer science or a related discipline, or equivalent work experience required, with an advanced degree preferred. Experience working with Java Spring on the back end and Angular on the front end is a must. 5-8 years of experience as a Java developer/programmer, using a specific application development toolkit and knowledge of front-end and backend development coding languages (Java 11.0, Spring 5, Spring Boot 3, HTML, CSS, JSON, Angular 10+, JavaScript). Should be comfortable with the following technology stack, tools and processes: ORM (Hibernate or iBatis), RDBMS (Oracle), SQL; NoSQL or unstructured database experience will be a plus. Understanding of the JVM, memory usage, performance testing and tuning. Understanding of the nuances of architecture; familiarity with different design and architectural patterns such as the MVC (Model-View-Controller) pattern and JDBC (Java Database Connectivity). Must have RESTful web service experience. Understanding of microservices-based, scalable architecture (previous experience working with Kafka). Experience implementing caching (using Hazelcast/Ehcache/Memcached/others) will be a plus. Strong experience in SDLC and DevOps processes - CI/CD tools, Git, etc. Should be capable of investigating production issues by checking code and Splunk logs.

At BNY, our culture speaks for itself. Here are a few of our awards: America's Most Innovative Companies, Fortune, 2024; World's Most Admired Companies, Fortune, 2024; Human Rights Campaign Foundation, Corporate Equality Index, 100% score, 2023-2024; Best Places to Work for Disability Inclusion, Disability:IN, 100% score, 2023-2024; Most Just Companies, Just Capital and CNBC, 2024; Dow Jones Sustainability Indices, top performing company for Sustainability, 2024; Bloomberg's Gender Equality Index (GEI), 2023.

Our Benefits and Rewards: BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy. We provide access to flexible global resources and tools for your life's journey. Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leave, including paid volunteer time, that can support you and your family through moments that matter.

BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.

Posted 3 months ago

Apply

12 - 19 years

27 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

Provide on-premise and cloud data architectural solutions/designs to project execution teams for implementation. Design and architect MS SQL database management solutions: MS SQL Server DB, SSIS ETL, SSRS Reporting Services and Power BI. Provide architectural assessments, strategies, and roadmaps for data management. Architect and design data migration patterns from RDBMS to a cloud data warehouse. Own and aggressively drive forward specific areas of AWS technology data architecture; this includes data management architecture involving batch, micro-batch, and real-time streaming of data in both cloud and on-premises solutions. Design knowledge of data modelling, ELT, MS SSIS/ADF ETLs, Spark and Python, along with strong SQL knowledge and strong DWH and ETL concepts. Support multiple Agile Scrum teams with planning, scoping and creation of technical solutions for new product capabilities, through continuous delivery to production. Liaise with the delivery team and clients for res
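
As a purely illustrative sketch (not part of this listing), one small piece of an RDBMS-to-cloud-warehouse migration, extracting rows from a source database and staging them as a flat file for a warehouse bulk loader, might look like the following. Python's sqlite3 stands in for the source RDBMS; the table, columns and file names are hypothetical.

```python
import csv
import sqlite3

# Sketch of an "extract and stage" step: pull rows from the source database and
# land them as CSV for a warehouse bulk loader to ingest.
# sqlite3 stands in for the source RDBMS; names and paths are hypothetical.
source = sqlite3.connect("source.db")
cursor = source.execute(
    "SELECT id, customer_id, amount, created_at FROM transactions WHERE created_at >= ?",
    ("2024-01-01",),
)

with open("transactions_2024.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow([col[0] for col in cursor.description])   # header row from cursor metadata
    for batch in iter(lambda: cursor.fetchmany(5_000), []):   # stream in batches, not all at once
        writer.writerows(batch)

source.close()
```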

Posted 3 months ago

Apply

4 - 6 years

17 - 19 Lacs

Bengaluru

Work from Office

Source: Naukri

HARMAN's engineers and designers are creative, purposeful and agile. As part of this team, you'll combine your technical expertise with innovative ideas to help drive cutting-edge solutions in the car, enterprise and connected ecosystem. Every day, you will push the boundaries of creative design, and HARMAN is committed to providing you with the opportunities, innovative technologies and resources to build a successful career.

A Career at HARMAN: As a technology leader that is rapidly on the move, HARMAN is filled with people who are focused on making life better. Innovation, inclusivity and teamwork are a part of our DNA. When you add that to the challenges we take on and solve together, you'll discover that at HARMAN you can grow, make a difference and be proud of the work you do every day.

5+ years of experience working as a Java developer, with expertise in Java/J2EE technologies, RDBMS, and continuous integration/deployment environments. Key technologies: Java Enterprise JDK, Vaadin, Maven, HTML5, JavaScript, SQL, Eclipse or IntelliJ IDE, Git, XML processing using SAX, StAX and XSLT, GWT, jQuery, TypeScript or similar toolkits, and Unix. Minimum 5 years of software engineering/development experience, including at least 4+ years of experience in the design and development of highly scalable Java/J2EE Vaadin-based web applications. Experience with various integration and deployment tools. Experience working with distributed computing and SOA. Experience with object-oriented design and analysis. Experience with the complete software development lifecycle; experience with agile methodologies.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.

Posted 3 months ago

Apply

5 - 10 years

20 - 23 Lacs

Bengaluru

Work from Office

Source: Naukri

HARMAN's engineers and designers are creative, purposeful and agile. As part of this team, you'll combine your technical expertise with innovative ideas to help drive cutting-edge solutions in the car, enterprise and connected ecosystem. Every day, you will push the boundaries of creative design, and HARMAN is committed to providing you with the opportunities, innovative technologies and resources to build a successful career.

A Career at HARMAN: As a technology leader that is rapidly on the move, HARMAN is filled with people who are focused on making life better. Innovation, inclusivity and teamwork are a part of our DNA. When you add that to the challenges we take on and solve together, you'll discover that at HARMAN you can grow, make a difference and be proud of the work you do every day.

Proficient in Java, Spring Boot, microservices and Kafka, and PostgreSQL or any RDBMS. Strong problem-solving skills, with relevant experience in implementing large-scale distributed backend services. Stays current with new and evolving technologies via self-directed education. Excellent written and verbal communication skills, including the ability to write detailed technical documents. Technically mentors a team of talented engineers on all aspects.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.

Posted 3 months ago

Apply

3 - 5 years

30 - 35 Lacs

Chennai, Pune, Delhi

Work from Office

Source: Naukri

Design and develop custom reports, dashboards, and MIS reports for banking operations. Work extensively with the T24 Data Model to extract, transform, and analyze banking data. Develop complex SQL queries, stored procedures, views, and triggers in MS SQL Server. Ensure data integrity and accuracy by performing data validation and reconciliation. Optimize report performance by implementing indexing, query optimization, and database tuning techniques.

Technical Expertise: Strong expertise in MS SQL Server, RDBMS, and SQL scripting. Deep understanding of the T24 Data Model, T24 tables, and reporting structure. Knowledge of ETL tools, Data Warehousing, and Business Intelligence (BI) tools (Oracle Analytics Server is a plus). Experience with performance tuning, indexing, and database optimization techniques.

Functional Expertise: Strong understanding of banking products (Loans, Deposits, Treasury, Payments, etc.). Experience working with regulatory and financial reports in banking. Knowledge of Temenos reporting frameworks and compliance requirements.

Other Skills: Analytical mindset with strong problem-solving skills. Excellent communication and stakeholder management skills. Ability to work independently and in a collaborative environment.

Technical Responsibilities: Extract data from the T24 Core Banking System. Develop and maintain ETL (Extract, Transform, Load) processes for report automation.

Functional Responsibilities: Understand banking processes, financial reporting, and regulatory compliance requirements. Collaborate with business users to define reporting needs and deliver actionable insights. Configure the T24 Reporting Framework and customize standard reporting functionalities. Ensure reports align with banking regulations and audit requirements. Provide support for UAT (User Acceptance Testing) and production deployments.
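
As an illustrative sketch only (not drawn from T24 or this listing), the "data validation and reconciliation" referenced above often reduces to comparing row counts and totals between a source table and a reporting table. The snippet below uses Python's sqlite3 in place of MS SQL Server, with hypothetical table and column names.

```python
import sqlite3

# Sketch of a reconciliation check: compare row counts and amount totals between
# a source table and a report table. sqlite3 stands in for MS SQL Server;
# table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_loans (loan_id TEXT, principal REAL);
    CREATE TABLE rpt_loans (loan_id TEXT, principal REAL);
    INSERT INTO src_loans VALUES ('L1', 1000.0), ('L2', 2500.0), ('L3', 400.0);
    INSERT INTO rpt_loans VALUES ('L1', 1000.0), ('L2', 2500.0);
""")

recon_sql = """
    SELECT s.cnt, r.cnt, s.total, r.total
    FROM (SELECT COUNT(*) AS cnt, SUM(principal) AS total FROM src_loans) AS s,
         (SELECT COUNT(*) AS cnt, SUM(principal) AS total FROM rpt_loans) AS r;
"""
src_rows, rpt_rows, src_total, rpt_total = conn.execute(recon_sql).fetchone()
if (src_rows, src_total) != (rpt_rows, rpt_total):
    print(f"Reconciliation break: source {src_rows}/{src_total} vs report {rpt_rows}/{rpt_total}")
```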

Posted 3 months ago

Apply

5 - 10 years

19 - 21 Lacs

Pune

Work from Office

Source: Naukri

Write test scenarios and test cases for enterprise applications, within schedule and within estimated effort. Provide estimates for assigned tasks. Write automated component, integration and E2E test cases. Review unit test cases written by developers and fix minor code defects. Provide accurate status of tasks. Perform peer reviews of automated test cases and mentor junior team members. Comply with the organization's processes and policies and protect the organization's intellectual property. Also, participate in organization-level process improvement and knowledge sharing.

All About You - Essential knowledge, skills and attributes: Proficient in analysing and testing mainframe jobs and COBOL programs. Hands-on experience writing test scenarios and test cases for enterprise applications. Hands-on experience with core Java, JUnit, JBehave, Spring Boot, SQL, RDBMS (Oracle and Postgres), NoSQL (Cassandra), web services (JSON and SOAP) and tools like Postman and SoapUI. Hands-on experience of testing microservice applications and API testing. Strong understanding of different test frameworks (Selenium, Rest Assured, BDD), with hands-on experience developing automation test frameworks. Experience of working with Agile methodologies. Personal attributes: strong logical and analytical skills; should be able to articulate and present his/her thoughts clearly and precisely in English (written and verbal). Knowledge of security concepts (e.g. authentication, authorization, confidentiality, etc.) and protocols, and their usage in enterprise applications.

Additional/Desirable capabilities: Experience of working in the payments application domain. Hands-on experience of working with tools like Mockito, JBehave, Jenkins, Bamboo, Confluence, Rally and Jira.
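
Purely as an illustrative sketch (the role itself is Java/JUnit oriented), an automated API test of the kind described above could look like the following Python analogue, using only the standard library's unittest and unittest.mock. The endpoint, payload and helper function are hypothetical.

```python
import json
import unittest
from unittest.mock import MagicMock, patch
from urllib.request import urlopen

# Hypothetical client under test: fetch a payment and return its status field.
def get_payment_status(payment_id: str) -> str:
    with urlopen(f"https://api.example.test/payments/{payment_id}") as resp:
        return json.loads(resp.read())["status"]

class PaymentStatusTest(unittest.TestCase):
    @patch(f"{__name__}.urlopen")
    def test_returns_status_field(self, mock_urlopen):
        # Stub the HTTP call so the test is deterministic and offline.
        fake_resp = MagicMock()
        fake_resp.read.return_value = b'{"status": "SETTLED"}'
        mock_urlopen.return_value.__enter__.return_value = fake_resp

        self.assertEqual(get_payment_status("P123"), "SETTLED")

if __name__ == "__main__":
    unittest.main()
```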

Posted 3 months ago

Apply

10 - 12 years

30 - 35 Lacs

Pune

Work from Office

Source: Naukri

Role Description: An Engineer is responsible for designing, developing and delivering significant components of engineering solutions to accomplish business goals efficiently and reliably. Key responsibilities of this role include active participation in the design of their solution components, investigating re-use, and ensuring that solutions are fit for purpose, reliable, maintainable, and can be integrated successfully into the overall solution and environment with clear, robust and well-tested deployments. Engineers actively look for opportunities to improve the availability and performance of components by applying the learning from monitoring and observation, automating towards zero touch, and championing a DevOps mindset.

Your key responsibilities: Hands-on software development; you will be primarily responsible for creating good-quality requirement specifications and the high-level design of reporting workflows. Should be able to contribute towards good software design. System integration testing of developed software. Review requirement specifications of other team members. Participate in and manage daily stand-up meetings. Participate in Agile Scrum ceremonies. Articulate issues and risks to team leads in a timely manner. This role will require 50% technical and 50% functional involvement, alongside other activities like team handling and mentoring. Analyse software defects and fix them in a timely manner. Work closely with Functional Analysis and Quality Assurance teams and other developers in the team to complete the task at hand.

Your skills and experience: IT experience of 10+ years, preferably with 3-4 years of relevant experience in regulatory reporting. Axiom CV v9/v10 experience is an added advantage. Must have proficiency in RDBMS (Oracle 12c, 18c or 19c, or any RDBMS database application). 10-12 years of experience in the following functional domains: Regulatory Reporting (preferred), Finance, Accounting, Derivatives, Trade Life Cycle, Risk Management, Capital Markets, Investment Banking. Financial data modelling and analysis for CB/IB products. Proficiency in performance tuning of SQL queries. Working knowledge of Unix shell scripting. Hands-on experience of an IT Business Analyst role involving requirement gathering, data onboarding/sourcing, data analysis, requirement documentation and user acceptance testing. Hands-on experience of handling local regulatory reporting requirements for Finance or Operations, including products like FX, Derivatives, Bonds, Repos, Loans, Deposits, Trade Finance etc. Good understanding of the complete trade lifecycle for the above-mentioned products. Experience of working on any of the local regulatory reporting requirements. Knowledge of financial statements like the Balance Sheet, Income Statement and Cashflow Statement, along with other regular reports like EMIR, MIFID, DFA, CCAR, Liquidity Coverage Ratio, Large Exposures Reporting, Non-Performing Assets Reporting etc.

Preferable additional experience: Experience of taxonomy reporting using Axiom. Experience of working on the Axiom v10 architecture. Working experience in migration from the v9 to the v10 Axiom architecture. Experience in Google Cloud Platform, Python and microservices. Agile methodology delivery experience. Liquidity: LCR implementation. Risk: Basel implementation. Good understanding of post-trade and settlement processes, along with accounting principles and standards for reporting like GAAP, IFRS etc.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Entity :- Accenture Strategy & Consulting Team :- Strategy & Consulting – Global Network Practice :- Marketing Analytics Title :- Data Science Consultant Job location :- Bengaluru, Chennai, Mumbai, Pune, Gurugram, Hyderabad About S&C - Global Network (GN) :- Accenture GN's Data & AI (AI Hub India) practice help our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling – to outperform the competition. WHAT'S IN IT FOR YOU? As part of our Analytics practice, you will join a worldwide network of over 20k+ smart and driven colleagues experienced in leading AI/ML/Statistical tools, methods and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically-informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. What you would do in this role A Consultant/Manager for Customer Data Platforms serves as the day-to-day marketing technology point of contact and helps our clients get value out of their investment into a Customer Data Platform (CDP) by developing a strategic roadmap focused on personalized activation. You will be working with a multidisciplinary team of Solution Architects, Data Engineers, Data Scientists, and Digital Marketers. Key Duties and Responsibilities: Be a platform expert in one or more leading CDP solutions. Developer level expertise on Lytics, Segment, Adobe Experience Platform, Amperity, Tealium, Treasure Data etc. Including custom build CDPs Deep developer level expertise for real time even tracking for web analytics e.g., Google Tag Manager, Adobe Launch etc. Provide deep domain expertise in our client's business and broad knowledge of digital marketing together with a Marketing Strategist industry Deep expert level knowledge of GA360/GA4, Adobe Analytics, Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trading desk etc. Assess and audit the current state of a client's marketing technology stack (MarTech) including data infrastructure, ad platforms and data security policies together with a solutions architect. Conduct stakeholder interviews and gather business requirements Translate business requirements into BRDs, CDP customer analytics use cases, structure technical solution Prioritize CDP use cases together with the client. Create a strategic CDP roadmap focused on data driven marketing activation. Work with the Solution Architect to strategize, architect, and document a scalable CDP implementation, tailored to the client's needs. Provide hands-on support and platform training for our clients. Data processing, data engineer and data schema/models expertise for CDPs to work on data models, unification logic etc. Work with Business Analysts, Data Architects, Technical Architects, DBAs to achieve project objectives - delivery dates, quality objectives etc. Business intelligence expertise for insights, actionable recommendations. Project management expertise for sprint planning Qualifications Who we are looking for? 
* B.Tech/M.Tech from a reputed engineering college, Master's/M.Tech in Computer Science, Master's degree in Statistics/Econometrics/Economics from a reputed institute, or M.Phil/Ph.D in Statistics/Econometrics or a related field
* At least 4+ years of relevant work experience implementing CDP solutions, with a deep understanding of identity resolution methods
* At least 4+ years of relevant work experience in marketing, consulting, or analytics
* At least 5+ years of experience working in an agency environment
* At least 3 years of experience working with multi-channel marketing hubs such as Adobe Experience Platform, Salesforce, Pega or similar
* Strong understanding of data governance and compliance (e.g., PII, PHI, GDPR, CCPA)
* Experience with analytics tools like Google Analytics or Adobe Analytics is a plus
* Experience with A/B testing tools is a plus
* Must have programming experience in PySpark, Python and shell scripts; RDBMS, T-SQL and NoSQL experience is a must
* Manage large volumes of structured and unstructured data; extract and clean data to make it amenable for analysis
* Experience in deploying and operationalizing code is an added advantage
* Experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous integration tools
* Proficient in Excel, MS Word, PowerPoint, etc.
Technical Skills:
* Experience with any CDP platform, e.g., Lytics CDP platform developer, and/or Segment CDP platform developer, and/or Adobe Experience Platform (Real-Time CDP) developer, and/or custom CDP developer on any cloud
* GA4/GA360 and/or Adobe Analytics
* Google Tag Manager, and/or Adobe Launch, and/or any tag manager tool
* Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk etc.
* Deep cloud experience (GCP, AWS, Azure)
* Advanced-level Python, SQL and shell scripting experience
* Data migration, DevOps, MLOps, Terraform scripting
Soft Skills:
* Strong problem-solving skills
* Good team player
* Attention to detail
* Good communication skills
Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
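As an illustration of the identity-resolution skills the qualifications mention, here is a minimal, hedged PySpark sketch; the input path, column names and the rule of matching on a normalized email are assumptions for illustration, not details from the posting.

```python
# Minimal sketch: deterministic identity resolution on a normalized email key.
# Input paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("cdp-identity-resolution").getOrCreate()

profiles = spark.read.parquet("s3://example-bucket/cdp/raw_profiles/")

# Normalize the identifier used as the deterministic match key.
keyed = profiles.withColumn("match_key", F.lower(F.trim(F.col("email"))))

# Keep the most recently updated record per match key as the "golden" profile.
latest_first = Window.partitionBy("match_key").orderBy(F.col("updated_at").desc())
golden = (
    keyed.withColumn("rank", F.row_number().over(latest_first))
         .filter(F.col("rank") == 1)
         .drop("rank")
)

golden.write.mode("overwrite").parquet("s3://example-bucket/cdp/unified_profiles/")
```

Real CDP unification would layer probabilistic matching on top of a deterministic key like this; the sketch only shows the deterministic step.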

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Gurgaon

Work from Office

Entity: Accenture Strategy & Consulting
Team: Strategy & Consulting – Global Network
Practice: Marketing Analytics
Title: Data Science Consultant
Job location: Bengaluru, Chennai, Mumbai, Pune, Gurugram, Hyderabad
About S&C - Global Network (GN): Accenture GN's Data & AI (AI Hub India) practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
WHAT'S IN IT FOR YOU? As part of our Analytics practice, you will join a worldwide network of 20k+ smart and driven colleagues experienced in leading AI/ML/statistical tools, methods and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance.
What you would do in this role: A Consultant/Manager for Customer Data Platforms serves as the day-to-day marketing technology point of contact and helps our clients get value out of their investment in a Customer Data Platform (CDP) by developing a strategic roadmap focused on personalized activation. You will be working with a multidisciplinary team of Solution Architects, Data Engineers, Data Scientists, and Digital Marketers.
Key Duties and Responsibilities:
* Be a platform expert in one or more leading CDP solutions, with developer-level expertise on Lytics, Segment, Adobe Experience Platform, Amperity, Tealium, Treasure Data etc., including custom-built CDPs
* Deep developer-level expertise in real-time event tracking for web analytics, e.g., Google Tag Manager, Adobe Launch etc.
* Provide deep domain expertise in our client's business and broad knowledge of digital marketing, working together with a Marketing Strategist
* Deep expert-level knowledge of GA360/GA4, Adobe Analytics, Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk etc.
* Assess and audit the current state of a client's marketing technology stack (MarTech), including data infrastructure, ad platforms and data security policies, together with a Solution Architect
* Conduct stakeholder interviews and gather business requirements
* Translate business requirements into BRDs and CDP customer analytics use cases, and structure the technical solution
* Prioritize CDP use cases together with the client
* Create a strategic CDP roadmap focused on data-driven marketing activation
* Work with the Solution Architect to strategize, architect, and document a scalable CDP implementation tailored to the client's needs
* Provide hands-on support and platform training for our clients
* Data processing, data engineering and data schema/model expertise for CDPs, working on data models, unification logic etc.
* Work with Business Analysts, Data Architects, Technical Architects and DBAs to achieve project objectives - delivery dates, quality objectives etc.
* Business intelligence expertise for insights and actionable recommendations
* Project management expertise for sprint planning
Qualifications - who are we looking for?
* B.Tech/M.Tech from a reputed engineering college, Master's/M.Tech in Computer Science, Master's degree in Statistics/Econometrics/Economics from a reputed institute, or M.Phil/Ph.D in Statistics/Econometrics or a related field
* At least 4+ years of relevant work experience implementing CDP solutions, with a deep understanding of identity resolution methods
* At least 4+ years of relevant work experience in marketing, consulting, or analytics
* At least 5+ years of experience working in an agency environment
* At least 3 years of experience working with multi-channel marketing hubs such as Adobe Experience Platform, Salesforce, Pega or similar
* Strong understanding of data governance and compliance (e.g., PII, PHI, GDPR, CCPA)
* Experience with analytics tools like Google Analytics or Adobe Analytics is a plus
* Experience with A/B testing tools is a plus
* Must have programming experience in PySpark, Python and shell scripts; RDBMS, T-SQL and NoSQL experience is a must
* Manage large volumes of structured and unstructured data; extract and clean data to make it amenable for analysis
* Experience in deploying and operationalizing code is an added advantage
* Experience with source control systems such as Git and Bitbucket, and with Jenkins build and continuous integration tools
* Proficient in Excel, MS Word, PowerPoint, etc.
Technical Skills:
* Experience with any CDP platform, e.g., Lytics CDP platform developer, and/or Segment CDP platform developer, and/or Adobe Experience Platform (Real-Time CDP) developer, and/or custom CDP developer on any cloud
* GA4/GA360 and/or Adobe Analytics
* Google Tag Manager, and/or Adobe Launch, and/or any tag manager tool
* Google Ads, DV360, Campaign Manager, Facebook Ads Manager, The Trade Desk etc.
* Deep cloud experience (GCP, AWS, Azure)
* Advanced-level Python, SQL and shell scripting experience
* Data migration, DevOps, MLOps, Terraform scripting
Soft Skills:
* Strong problem-solving skills
* Good team player
* Attention to detail
* Good communication skills
Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
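Because the duties emphasize real-time event tracking, the sketch below shows one minimal, hedged way to send a server-side event to GA4 via the Measurement Protocol with Python's `requests` library; the measurement ID, API secret and event payload are placeholders, not values from the posting.

```python
# Minimal sketch: send a server-side GA4 event through the Measurement Protocol.
# MEASUREMENT_ID and API_SECRET are hypothetical placeholders from the GA4 admin UI.
import requests

MEASUREMENT_ID = "G-XXXXXXX"    # hypothetical
API_SECRET = "your_api_secret"  # hypothetical

payload = {
    "client_id": "555.666",  # anonymous client identifier
    "events": [{
        "name": "purchase",
        "params": {"currency": "INR", "value": 1299.0},
    }],
}

resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
    timeout=10,
)
resp.raise_for_status()
# Note: GA4 accepts malformed events silently; the /debug/mp/collect endpoint can be used to validate payloads.
```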

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Chennai, Hyderabad

Work from Office

What you'll be doing... You will be part of a world-class Container Platform team that builds and operates highly scalable Kubernetes-based container platforms (EKS, OCP, OKE and GKE) at large scale for Global Technology Solutions at Verizon, a top-20 Fortune 500 company. This individual will have a high level of technical expertise and daily hands-on implementation, working in a product team developing services in two-week sprints using agile principles. This entails programming and orchestrating the deployment of feature sets into the Kubernetes CaaS platform, along with building Docker containers via a fully automated CI/CD pipeline utilizing AWS, Jenkins, Ansible playbooks, and CI/CD tools and processes (Jenkins, JIRA, GitLab, ArgoCD), Python, shell scripts or any other scripting technologies. You will have autonomous control over day-to-day activities allocated to the team as part of agile development of new services.
* Automation and testing of different platform deployments, maintenance and decommissioning
* Full-stack development
* Participate in POC (proof of concept) technical evaluations of new technologies for use in the cloud
What we're looking for...
You'll need to have:
* Bachelor's degree or four or more years of experience
* GitOps CI/CD workflows (ArgoCD, Flux) and working in an Agile ceremonies model
* Address Jira tickets opened by platform customers
* Strong expertise in SDLC and agile development
* Experience designing, developing and implementing scalable React/Node-based applications (full-stack developer)
* Experience developing HTTP/RESTful APIs and microservices
* Experience with serverless Lambda development, AWS EventBridge, AWS Step Functions, DynamoDB, Python
* Database experience (RDBMS, NoSQL, etc.)
* Familiarity integrating with existing web application portals
* Strong backend development experience with languages including Golang (preferred), Spring Boot and Python
* Experience with GitLab CI/CD, Jenkins, Helm, Terraform, Artifactory
* Strong development of K8s tools/components, which may include standalone utilities/plugins, cert-manager plugins, etc.
* Development and working experience with service mesh lifecycle management, and with configuring and troubleshooting applications deployed on a service mesh and service-mesh-related issues
* Strong Terraform and/or Ansible and Bash scripting experience
* Effective code review, quality and performance tuning experience; test-driven development
* Certified Kubernetes Application Developer (CKAD)
* Excellent cross-collaboration and communication skills
Even better if you have one or more of the following:
* Working experience with security tools such as Sysdig, Crowdstrike, Black Duck, Xray, etc.
* Experience with OWASP rules and mitigating security vulnerabilities using security tools like Fortify, SonarQube, etc.
* Experience with monitoring tools like New Relic (NRDOT), OTLP
* Certified Kubernetes Administrator (CKA)
* Certified Kubernetes Security Specialist (CKS)
* Red Hat Certified OpenShift Administrator
* Development experience with the Operator SDK
* Experience creating validating and/or mutating webhooks
* Familiarity with creating custom EnvoyFilters for the Istio service mesh, and with cost optimization tools like Kubecost and CloudHealth to implement right-sizing recommendations
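To illustrate the kind of platform tooling such a role involves, here is a minimal, hedged sketch using the official Kubernetes Python client (the namespace name is hypothetical) that lists deployments and flags any whose replicas are not fully available, the sort of check a platform automation script might run.

```python
# Minimal sketch: report deployments with unavailable replicas in a namespace.
# The namespace name is hypothetical; requires a valid kubeconfig context.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a pod
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment(namespace="platform").items:
    desired = dep.spec.replicas or 0
    available = dep.status.available_replicas or 0
    if available < desired:
        print(f"{dep.metadata.name}: {available}/{desired} replicas available")
```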

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP BASIS Administration
Good to have skills: Microsoft SQL Server Administration, Cloud Technologies, Advanced B, Solution Manager, BOBJ
Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education is required.
Summary: As a Software Development Engineer, you will be responsible for analyzing, designing, coding, and testing multiple components of application code across one or more clients. Your typical day will involve performing maintenance, enhancements, and/or development work on SAP BASIS Administration.
Roles & Responsibilities:
* Lead the design, implementation, and maintenance of SAP BASIS Administration for multiple clients
* Collaborate with cross-functional teams to ensure the smooth functioning of SAP BASIS Administration
* Perform maintenance, enhancements, and/or development work on SAP BASIS Administration
* Ensure the security, performance, and availability of SAP BASIS Administration
* Provide technical support and guidance to clients on SAP BASIS Administration
Professional & Technical Skills:
Must Have Skills:
* Strong experience in SAP BASIS Administration
* Expertise in hosting SAP NWA applications on Windows Server
* RDBMS knowledge - expertise in MSSQL DB
* SAP JAVA and Portal Administration
* SAP NWA system installation on Windows Server, upgrade, system copy/refresh
* Good communication skills - client-facing, verbal and written
* Ticketing tools, third-party tools, SAP integration tools
* Support for a 16/7 shift
Good To Have Skills:
* Operating system, MS-SQL, BOBJ
* Expertise in SAP NetWeaver, SAP HANA, and SAP Solution Manager
* Experience in SAP system installation, upgrade, and patching
* Knowledge of SAP security and authorization concepts
* Experience in SAP system monitoring, performance tuning, and troubleshooting
Additional Information:
* The candidate should have a minimum of 7-8 years of experience in SAP BASIS Administration
* The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions
* This position is based at our Bengaluru office
Qualifications: 15 years of full-time education is required

Posted 3 months ago

Apply

2 - 6 years

4 - 8 Lacs

Pune

Work from Office

Design, implement, and manage large-scale data processing systems using Big Data technologies such as Hadoop, Apache Spark, and Hive. Develop and manage our database infrastructure based on Relational Database Management Systems (RDBMS), with strong expertise in SQL. Utilize scheduling tools like Airflow, Control M, or shell scripting to automate data pipelines and workflows. Write efficient code in Python and/or Scala for data manipulation and processing tasks. Leverage AWS services including S3, Redshift, and EMR to create scalable, cost-effective data storage and processing solutions.
Required education: Bachelor's Degree
Required technical and professional expertise:
* Proficiency in Big Data technologies, including Hadoop, Apache Spark, and Hive
* Strong understanding of AWS services, particularly S3, Redshift, and EMR
* Deep expertise in RDBMS and SQL, with a proven track record in database management and query optimization
* Experience using scheduling tools such as Airflow, Control M, or shell scripting
* Practical experience in the Python and/or Scala programming languages
Preferred technical and professional experience:
* Knowledge of Core Java (1.8 preferred) is highly desired
* Excellent communication skills and a willing attitude towards learning
* Solid experience in Linux and shell scripting
* Experience with PySpark or Spark is nice to have
* Familiarity with DevOps tools including Bamboo, JIRA, Git, Confluence, and Bitbucket is nice to have
* Experience in data modelling, data quality assurance, and load assurance is nice to have
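As a hedged illustration of the pipeline work described above, the sketch below (bucket names, columns and paths are hypothetical) reads raw CSV data from S3 with PySpark, applies a simple aggregation, and writes partitioned Parquet back to S3, the kind of job that might run on an EMR cluster.

```python
# Minimal sketch: S3 CSV -> transform -> partitioned Parquet, e.g. on an EMR cluster.
# Bucket names, columns and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

orders = (
    spark.read.option("header", True)
         .csv("s3://example-raw-bucket/orders/2024-06-01/")
)

daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders_daily/"
)
```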

Posted 3 months ago

Apply

10 - 15 years

50 - 55 Lacs

Pune

Work from Office

Role Description
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG risk mitigation and corporate sustainability. As climate change brings new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set, predominantly in cloud / hybrid architecture.
Your key responsibilities
For this role we are seeking advanced expertise in Python- and SQL-based data engineering: strong experience in API development using Python-based frameworks, strong experience in building highly scalable and reusable Python frameworks, and deep experience with RDBMS (Oracle, Postgres, BigQuery), ETL/ELT concepts and orchestration tools (Airflow). In this senior role, you will be a trusted advisor, providing technical expertise and strategic direction across all things data engineering.
Technical expertise:
* Must have development experience using Python and SQL
* Design and optimize complex data pipelines for efficient data ingestion, transformation, and analysis
* Partner with the product management group and other business stakeholders to gather requirements, translate them into technical specifications, and design effective, reusable Python-driven frameworks
* Design and develop complex data models, leveraging expertise in relational and dimensional modelling techniques
* Advocate for data engineering best practices
Collaboration & mentorship:
* Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and drive data-driven decision-making
* Mentor and guide junior team members on data engineering technologies
* Foster a culture of innovation and continuous improvement within the data and BI domain
Staying current:
* Track emerging trends and innovations in data engineering methodologies
* Proactively research and recommend new technologies and solutions to enhance our data capabilities
Your skills and experience
* 10 years of experience in data warehousing, data management, and data engineering
* Must have experience building ETL/ELT processes using Python and strong Python scripting knowledge
* Must have experience architecting and building scalable data frameworks
* Experience designing and implementing complex data pipelines
* In-depth knowledge of relational and dimensional modelling techniques for BI
* Experience with PL/SQL
* Strong experience in API development and data extraction from APIs using Python-based frameworks
* Experience in data integration concepts
* Experience with DevOps tools like Git and Jenkins
* Excellent communication, collaboration, and problem-solving skills
* Ability to translate technical concepts into clear, actionable insights for business stakeholders
* Strong leadership presence and ability to influence and inspire others
* Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus
* Knowledge of GCP cloud infrastructure and data governance best practices will be a plus
* Knowledge of Terraform will be a plus
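Because the role centres on Python-based ETL/ELT orchestrated with Airflow, here is a minimal, hedged Airflow 2.x sketch; the DAG ID, API URL and load target are hypothetical placeholders, and a real pipeline would add retries, Airflow-managed connections and proper loading logic.

```python
# Minimal sketch: a daily extract-and-load DAG in Airflow 2.x.
# The API URL, DAG id and load target are hypothetical placeholders.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull one day's worth of records from a hypothetical upstream API.
    resp = requests.get("https://api.example.com/esg/scores", timeout=30)
    resp.raise_for_status()
    return resp.json()


def load(ti, **context):
    records = ti.xcom_pull(task_ids="extract")
    # In a real pipeline this would write to Postgres/BigQuery via a hook.
    print(f"would load {len(records)} records")


with DAG(
    dag_id="esg_scores_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> PythonOperator(
        task_id="load", python_callable=load
    )
```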

Posted 3 months ago

Apply

10 - 12 years

30 - 35 Lacs

Pune

Work from Office

Role Description
An Engineer is responsible for designing, developing and delivering significant components of engineering solutions to accomplish business goals efficiently and reliably. Key responsibilities of this role include active participation in the design of their solution components, investigating re-use, and ensuring that solutions are fit for purpose, reliable, maintainable, and can be integrated successfully into the overall solution and environment with clear, robust and well-tested deployments. Engineers actively look for opportunities to improve the availability and performance of components by applying the learning from monitoring and observation, automating towards zero touch, and championing a 'DevOps' mind-set.
Your key responsibilities
* Hands-on software development; primarily responsible for creating good-quality requirement specifications and the high-level design of the reporting workflow
* Contribute towards good software design
* System integration testing of developed software
* Review requirement specifications of other team members
* Participate in and manage daily stand-up meetings
* Participate in Agile Scrum ceremonies
* Articulate issues and risks to team leads in a timely manner
* This role will require 50% technical and 50% functional involvement in other activities such as team handling and mentoring
* Analyse software defects and fix them in a timely manner
* Work closely with the Functional Analysis and Quality Assurance teams and other developers in the team to complete the task in hand
Your skills and experience
* IT experience of 10+ years, preferably with 3-4 years of relevant experience in regulatory reporting; Axiom CV V9/10 experience is an added advantage
* Must have proficiency in RDBMS (Oracle 12c, 18c or 19c, or any RDBMS database application)
* 10-12 years' experience in the following functional domains: Regulatory Reporting (preferred), Finance, Accounting, Derivatives, Trade Life Cycle, Risk Management, Capital Markets, Investment Banking
* Financial data modelling & analysis - CB/IB products
* Proficiency in performance tuning of SQL queries
* Working knowledge of Unix shell scripting
* Hands-on experience of an IT Business Analyst role involving requirement gathering, data onboarding/sourcing, data analysis, requirement documentation and user acceptance testing
* Hands-on experience of handling local regulatory reporting requirements for Finance or Operations, including products like FX, Derivatives, Bonds, Repos, Loans, Deposits, Trade Finance etc.
* Good understanding of the complete trade lifecycle for the above-mentioned products
* Experience of working on any of the local regulatory reporting requirements
* Knowledge of financial statements like the Balance Sheet, Income Statement and Cashflow Statement, along with other regular reports like EMIR, MIFID, DFA, CCAR, Liquidity Coverage Ratio, Large Exposures Reporting, Non-Performing Assets Reporting etc.
Preferable to have the additional experience below
* Experience of taxonomy reporting using Axiom
* Experience of working on Axiom v10 architecture
* Working experience in migration from v9 to v10 Axiom architecture
* Experience in Google Cloud Platform, Python, microservices
* Agile methodology delivery experience
* Liquidity - LCR implementation
* Risk - BASEL implementation
* Good understanding of post-trade & settlement processes, along with accounting principles and standards for reporting like GAAP, IFRS etc.
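The preferred extras mention Python and microservices; as a hedged, minimal illustration (the endpoint path and report model are hypothetical, and FastAPI is just one common Python choice, not a framework named in the posting), a small reporting-status microservice might look like this.

```python
# Minimal sketch: a tiny reporting-status microservice with FastAPI.
# Endpoint paths and the response model are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="reg-reporting-status")

# In-memory stand-in for a real report-run store (e.g., a database table).
RUNS = {"LCR-2024-06-01": {"report": "LCR", "status": "SUBMITTED"}}


class RunStatus(BaseModel):
    report: str
    status: str


@app.get("/runs/{run_id}", response_model=RunStatus)
def get_run(run_id: str) -> RunStatus:
    run = RUNS.get(run_id)
    if run is None:
        raise HTTPException(status_code=404, detail="unknown run id")
    return RunStatus(**run)

# Run locally with: uvicorn status_service:app --reload
```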

Posted 3 months ago

Apply

2 - 4 years

4 - 8 Lacs

Pune

Work from Office

Design, implement, and manage large-scale data processing systems using Big Data technologies such as Hadoop, Apache Spark, and Hive. Develop and manage our database infrastructure based on Relational Database Management Systems (RDBMS), with strong expertise in SQL. Utilize scheduling tools like Airflow, Control M, or shell scripting to automate data pipelines and workflows. Write efficient code in Python and/or Scala for data manipulation and processing tasks. Leverage AWS services including S3, Redshift, and EMR to create scalable, cost-effective data storage and processing solutions.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
* Proficiency in Big Data technologies, including Hadoop, Apache Spark, and Hive
* Strong understanding of AWS services, particularly S3, Redshift, and EMR
* Deep expertise in RDBMS and SQL, with a proven track record in database management and query optimization
* Experience using scheduling tools such as Airflow, Control M, or shell scripting
* Practical experience in the Python and/or Scala programming languages
Preferred technical and professional experience:
* Knowledge of Core Java (1.8 preferred) is highly desired
* Excellent communication skills and a willing attitude towards learning
* Solid experience in Linux and shell scripting
* Experience with PySpark or Spark is nice to have
* Familiarity with DevOps tools including Bamboo, JIRA, Git, Confluence, and Bitbucket is nice to have
* Experience in data modelling, data quality assurance, and load assurance is nice to have
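To illustrate the AWS side of this work, here is a hedged, minimal boto3 sketch (the cluster ID, region, bucket and script path are hypothetical placeholders) that submits a spark-submit step to an existing EMR cluster.

```python
# Minimal sketch: submit a Spark job as a step to an existing EMR cluster.
# The cluster id, region and S3 script path are hypothetical placeholders.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLE123",
    Steps=[{
        "Name": "orders-daily-load",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--deploy-mode", "cluster",
                "s3://example-artifacts/jobs/orders_daily.py",
            ],
        },
    }],
)

print("submitted step:", response["StepIds"][0])
```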

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies