
1689 RDBMS Jobs - Page 8

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

18 - 30 Lacs

South Goa, Pune

Hybrid


We are looking for a Lead Python Developer with deep experience in Python and cloud for an exciting, cutting-edge stealth startup in Silicon Valley.

Responsibilities:
• Team player: delivers effectively with teams; interpersonal, communication, and risk-management skills
• Complete ownership and delivery of end-to-end features, production infrastructure, and CI/CD initiatives
• Write well-designed, testable, readable, and efficient Python code
• Write automated unit and integration tests
• Own DB migrations across MongoDB and Postgres
• Guide QA teams towards efficient validation of high-risk functional and system tests
• CI/CD deployments on Amazon AWS and Microsoft Azure cloud platforms
• Optimize applications for maximum reliability, performance, and scalability
• Drive feature implementations independently to completion
• Participate in requirement analysis, solution design, project planning, and tracking activities
• Understand, contribute to, and evolve Numinos best practices for design and coding

Requirements:
• Strong computer science fundamentals: data structures and algorithms, networking, RDBMS, and distributed computing
• 3-8 years of experience on the Python stack: Behave, PyTest, Python generators and async operations, multithreading, context managers, decorators, descriptors
• Python frameworks: FastAPI, Flask, Django, or SQLAlchemy
• Deep DB expertise with experience in data modelling and complex queries: Postgres, MongoDB
• Expertise in REST API design, authentication, and single sign-on
• Expertise in GCP, AWS, or Microsoft Azure cloud platforms
• Experience delivering solutions with high performance and scale criteria
• Experience with Docker, Kubernetes, Redis cache, and microservices
• Experience with Linux OSs, web servers, and load balancers: Apache, Tomcat, NGINX
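The Python-stack items this posting names (generators, context managers, decorators) can be illustrated with a minimal sketch. This is not part of the posting; all names here are hypothetical.

```python
# Minimal sketch (illustrative only) of decorators, context managers, and
# generators, the Python-stack concepts the requirements list.
import time
from contextlib import contextmanager

def timed(func):
    """Decorator that records how long the last call took on the wrapper."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_elapsed = time.perf_counter() - start
        return result
    return wrapper

@contextmanager
def transaction(log):
    """Context-manager sketch: commit on success, roll back on error."""
    log.append("begin")
    try:
        yield
        log.append("commit")
    except Exception:
        log.append("rollback")
        raise

def batches(items, size):
    """Generator yielding fixed-size chunks, as one might during a DB migration."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

@timed
def migrate(rows):
    log = []
    with transaction(log):
        moved = [row for batch in batches(rows, 2) for row in batch]
    return moved, log

print(migrate(list(range(5))))  # → ([0, 1, 2, 3, 4], ['begin', 'commit'])
```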

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 19 Lacs

Pune

Work from Office


We are seeking a Data Engineer with strong expertise in Microsoft Fabric and Databricks to support our enterprise data platform initiatives.

Role: Data Engineer (Microsoft Fabric & Databricks)
Location: Pune / Remote

Key Responsibilities:
• Develop and maintain scalable data platforms using Microsoft Fabric for BI and Databricks for real-time analytics.
• Build robust data pipelines for SAP, MS Dynamics, and other cloud/on-prem sources.
• Design enterprise-scale data lakes and integrate structured/unstructured data.
• Optimize algorithms developed by data scientists and ensure platform reliability.
• Collaborate with data scientists, architects, and business teams in a global environment.
• Perform general administration, security, and monitoring of data platforms.

Mandatory Skills:
• Experience with Microsoft Fabric (Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Models) and/or Databricks (Apache Spark).
• Strong background in Python, SQL (Scala is a plus), and API integration.
• Hands-on experience with Power BI and various database technologies (RDBMS, OLAP, time series).
• Experience working with large datasets, preferably in an industrial or enterprise environment.
• Proven skills in performance tuning, data modeling, data mining, and cloud security (Azure preferred).

Nice to Have:
• Knowledge of Azure data services (Storage, Networking, Billing, Security).
• Experience with DevOps, agile software development, and working in international/multicultural teams.

Candidate Requirements:
• 4+ years of experience as a data engineer.
• Bachelor's or Master's degree in Computer Science, Information Systems, or related fields.
• Strong problem-solving skills and high attention to detail.
• Proficiency in English (written and verbal).

Please share your resume at Neesha1@damcogroup.com
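The pipeline-building responsibilities above can be sketched in miniature. This is a hypothetical illustration: SQLite stands in for a warehouse layer such as Fabric or Databricks SQL, and the table, columns, and records are invented.

```python
# Miniature ETL sketch (illustrative only): extract from dict records,
# transform (filter and normalize), load into SQLite, then aggregate.
import sqlite3

def run_pipeline(source_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform: drop malformed records, normalize region names.
    cleaned = [
        (r["region"].strip().upper(), float(r["amount"]))
        for r in source_rows
        if r.get("region") and r.get("amount") is not None
    ]
    # Load in one batch, then aggregate as a BI layer would.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    totals = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    ).fetchall()
    conn.close()
    return totals

rows = [
    {"region": "emea ", "amount": 100.0},
    {"region": "apac", "amount": 50.0},
    {"region": None, "amount": 10.0},  # dropped by the transform step
    {"region": "emea", "amount": 25.0},
]
print(run_pipeline(rows))  # → [('APAC', 50.0), ('EMEA', 125.0)]
```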

Posted 1 week ago

Apply

9.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Strong hands-on engineering lead for the Credit Origination Hive in PE. This is the #2 priority for the CTA program, and strong engineering talent is required to drive the rebuild of the CreditMate legacy platform. The required skillset is to completely overhaul the platform and develop an in-house solution on the latest technology stack. The person will drive the solution design, architecture, and execution of developing the new CreditMate UI, aligned with the CC-wide unified UI/UX strategy. This is a strategic role providing overall technical leadership and direction, encouraging innovation and improvement in technology and data systems, processes, and ways of working.

Key Responsibilities

Strategy
• Advise on future technology capabilities and architecture design considering business objectives, technology strategy, trends, and regulatory requirements
• Awareness and understanding of the Group's business strategy and model appropriate to the role

Business
• Awareness and understanding of the wider business, economic, and market environment in which the Group operates
• Understand and recommend business flows and translate them to the API ecosystem

Processes
• Responsible for executing and supervising microservices development to facilitate business capabilities and orchestrate to achieve business outcomes

People & Talent
• Lead through example and build the appropriate culture and values; set the appropriate tone and expectations for the team and work in collaboration with risk and control partners
• Ensure the provision of ongoing training and development of people, and ensure that holders of all critical functions are suitably skilled and qualified for their roles, with effective supervision in place to mitigate any risks

Governance
• Awareness and understanding of the regulatory framework in which the Group operates, and the regulatory requirements and expectations relevant to the role

Regulatory & Business Conduct
• Display exemplary conduct and live by the Group's Values and Code of Conduct.
• Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
• Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.
• [Fill in for regulated roles] Lead the [country / business unit / function / XXX team] to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
• [Insert local regulator, e.g. PRA/FCA, prescribed responsibilities and rationale for allocation]
• [Where relevant, additionally for subsidiaries or relevant non-subsidiaries] Serve as a Director of the Board of [insert name of entities]; exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent)

Key stakeholders
Product Owners, Hive Leads, Client Coverage Tech and Business Stakeholders

Other Responsibilities
• Embed Here for Good and the Group's brand and values in XXXX [country / business unit / team]
• Perform other responsibilities assigned under Group, Country, Business, or Functional policies and procedures
• Multiple functions (double hats); [List all responsibilities associated with the role]

Technical Competence: Must Have
• Expert programming skills in Java and related technologies
• Good hands-on experience with Java Collections and Streams
• Good knowledge of design patterns and principles
• Strong experience in developing application data models
• Good hands-on experience developing seamless interfaces across multiple OLTP/OLAP applications
• Good knowledge of API building (web services, SOAP/REST)
• Good knowledge of unit testing and code coverage using JUnit/Mockito
• Good knowledge of code-quality tools like SonarQube
• Good knowledge of multi-threading and multi-processing implementations
• Strong experience in RDBMS (Oracle, PostgreSQL, MySQL)
• Ability to work in a fast-paced, dynamic environment, adopting agile methodologies
• Ability to work with minimal guidance and/or high-level design input
• Knowledge of microservices-based development and implementation
• Knowledge of CI/CD patterns with related tools like Git, Bitbucket, Jenkins, JIRA, etc.
• Knowledge of JSON libraries like Jackson/GSON
• Knowledge of basic Unix commands
• Good documentation and presentation skills; able to articulate ideas, designs, and suggestions
• Mentoring fellow team members and conducting code reviews

Domain: Good to Have
• Experience in application development for Client Coverage Credit Modules
• Good knowledge of cloud-native application development and cloud computing services

Our Ideal Candidate: Standard Job Requirements
• 10+ years of experience in application development using Java and related technologies
• Proven track record of building global, enterprise-grade applications from scratch
• Strong understanding of fundamental architecture and design principles, object-orientation principles, and coding standards
• Experience building extensible and scalable solutions
• Strong analytical and problem-solving skills
• Strong verbal and written communication skills
• Excellent knowledge of DevOps and CI/CD
• Experienced in Agile methodology and Waterfall models
• Experience developing enterprise/application architecture in detail
• Strong experience in application delivery, including production support
• Ability to learn and adapt to new technologies and frameworks
• Awareness of release management
• Strong team player who can collaborate effectively with relevant stakeholders

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other.
We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we:
• Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
• Never settle, continuously striving to improve and innovate, keeping things simple, and learning from doing well, and not so well
• Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing.
• Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations
• Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum
• Flexible working options based around home and office locations, with flexible working patterns
• Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders, and a range of self-help toolkits
• A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning
• Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential

www.sc.com/careers 24062

Posted 1 week ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Pune

Work from Office


Job: Product Support Engineer, Pune (J49074)

Job Summary
Qualifications: BE-Comp/IT, BSc-Comp/IT, BSc-Other, BTech-Comp/IT, BTech-Other, MCA, ME-Comp/IT, ME-Other, MSc-Comp/IT, MS-Comp/IT, MSc-Other, MTech-Comp/IT, MTech-Other

Company Description
Our client is a global information technology company founded in 1965 with its headquarters in Kista, Sweden. It provides real-time operating systems and consulting services. The company empowers mobile operators to manage and monetize encrypted traffic. Based on the industry's most scalable NFV platform, its solutions alleviate RAN congestion, create new revenue opportunities, and unify data from virtualized applications. The company provides solutions for mobile video traffic management, cloud data management, and 5G data management. Its global customer base consists of over 40 of the largest communication service providers, including AT&T, Du, KDDI, Orange, Rogers, Sprint, Telus, T-Mobile, Telefonica, Telstra, Vodafone, and Zain. It has offices all over the world, in Berlin, Russia, Northern Ireland, and elsewhere, and in India at Pune and Hyderabad.

Major Responsibilities
• Serve as the primary support contact and a technical support liaison to specified customers and monitor their email team-lists and new Support Service requests.
• Maintain information on allocated customers, including contact points, deployment data, remote access method(s), and other information requested by management.
• Maintain administration of allocated cases/tickets, ensuring that case detail and status are accurate and up to date at all times. Where necessary, escalate issues to other team members and managers, or to other teams, in accordance with relevant procedures.
• Serve as a technical expert within the team and assist and guide engineers in the execution of their duties and problem resolution.
• Install, deploy, and test the company's software.
• May travel to customer sites to perform project or support work.
• Log issues (when a software bug is discovered) in the bug-tracking system, reproduce the bug, and provide all reasonable data, including instructions on how the bug is reproduced, to the Product Group to assist them in resolving the issue.
• Create knowledge-base documentation for all resolved issues.
• May serve as a technical support liaison and the primary support contact to specified partners.
• Work toward certification in one or more relevant non-company technologies.
• Write tools and scripts to assist in troubleshooting and support activities.
• Technically engage in, and often lead, the technical resolution of crisis-management situations as requested by their manager and/or the Crisis Management team.
• Participate in an on-call rotation by being available by pager 12 hours per day, 7 days per week, including public holidays. Respond to pager alerts immediately and be no more than 15 minutes away from being able to actively engage; log any technical support issues raised in the call-tracking system and begin resolution.
• Install, configure, and test new patches and services in laboratory, pre-production, and production environments, creating all the needed documentation, including the cutovers and traffic migrations at the customer site.
• Participate in new Service Line rollouts and the implementation/deployment of new Mobility products.
• Flexibility to work UK/US hours and on weekends.

Required Knowledge, Skills, and Experience
• Telecom support domain experience preferred; product support experience is a plus.
• A scientific degree with 4-10 years of experience in the technical support arena in a software and/or telco environment, preferably in a multinational company dealing with customers and colleagues around the world.
• Strong practical Unix/Linux operations, administration, and troubleshooting skills.
• TCP/IP and knowledge of networking.
• Unix scripting and maintaining databases.
• Very good debugging of packet/network captures using Wireshark, tcpdump, tshark, etc.
• Linux system administration or DevOps experience with an emphasis on system and application support, deployment, and automation.
• Deep networking experience from OSI layer 2 to layer 7+ (TCP/UDP/IP, HTTP, HTTPS, load balancers, firewalls, routers, switches).
• Experience with SQL, LDAP, or RDBMS.
• Good understanding of monitoring and reporting tools: Grafana, Kibana, Pentaho, Splunk, Nagios, etc.
• Public/private clouds and virtualization (AWS, OpenStack, VMware).
• Understanding of CI/CD, Kubernetes, Ansible, Jenkins, and DevOps concepts.
• Proven skills in writing scripts (Shell, Perl, Python) to automate routine tasks or issue-troubleshooting techniques, and the debugging skills to understand existing ones.
• Proven technical expertise in mobility, Internet, and/or mobile technologies, with strong problem-solving skills and a demonstrated ability to articulate and present technical solutions to address business problems.
• Strong interpersonal and communication skills, both written and verbal, with the ability to develop and maintain strong working relationships at all levels, both with the customer and internally.
• Demonstrated ability to work under pressure, manage critical situations, and influence without direct authority.
• Politically astute, with an understanding of commercial impact and principles.
• Operationally driven, with proven experience in a results-driven environment.
• Customer-focused and self-motivated, with strong teamwork skills and a flexible approach.
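The "writing scripts to automate routine tasks" requirement can be illustrated with a small triage sketch. The log format here is hypothetical; real formats vary by product.

```python
# Illustrative support-automation sketch: count ERROR lines per component,
# the kind of first-pass triage a support engineer scripts before digging
# into packet captures. The "LEVEL component: message" format is invented.
import re
from collections import Counter

LINE_RE = re.compile(r"^(?P<level>[A-Z]+)\s+(?P<component>\S+):\s+(?P<msg>.*)$")

def summarize_errors(log_text):
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return dict(counts)

sample = """\
INFO  lb01: health check ok
ERROR gw02: upstream timeout
ERROR gw02: upstream timeout
ERROR db01: connection refused
"""
print(summarize_errors(sample))  # → {'gw02': 2, 'db01': 1}
```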

Posted 1 week ago

Apply

2.0 - 6.0 years

9 - 14 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of a Senior Software Engineer.

In this role, you will:
• Provide support across the end-to-end delivery and run lifecycle, utilising your skills and expertise to carry out software development, testing, and operational support activities, with the ability to move between these according to demand
• Be responsible for automating the continuous integration / continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement
• Provide support in the identification and resolution of all incidents associated with the IT service, as directed by the leadership of the DevOps team
• Ensure service resilience, service sustainability, and recovery-time objectives are met for all the software solutions delivered

Requirements
To be successful in this role, you should meet the following requirements:
• Previous experience working in financial services and/or banking operations is beneficial but not essential
• A university degree is required
• Languages: good knowledge of Java-based n-tier applications (preferably with the Spring framework) and any RDBMS or NoSQL databases
• PostgreSQL experience is an added advantage
• Basic knowledge of blockchain and distributed-ledger technologies, with flexibility to work in any new technology/language related to blockchain
• Should have worked with distributed-ledger technologies such as Ethereum, Corda, or IBM Hyperledger Fabric
• Exposure to cryptography technologies
• Experience with JavaScript technologies (such as ReactJS/Angular) and HTML5
• Cloud experience is desirable
• Experience in CI/CD is helpful, with tools like Jenkins and Ansible
• Experience working with a live production service
• Working with IT Ops/Support, providing required support to blockchain/DLT systems post go-live

Behavioural
• Detail-oriented, with the passion to rapidly learn new skills, concepts, and technologies
• Strong problem-solving skills, including providing simple solutions to complex situations
• Strong communication/interpersonal skills, with an ability to relate concepts to non-technical colleagues
• Willing to receive feedback on their work to help them progress and grow
• Excellent motivational skills with the ability to adapt to change

Posted 1 week ago

Apply

3.0 - 7.0 years

4 - 8 Lacs

Chennai

Work from Office


We are building up a new group within Anthology focused on the data platform. This team's mission is to bring data together from across Anthology's extensive product lines into our cloud-based data lake. We are the analytics and data experts at Anthology. Our team enables other development teams to utilize the data lake strategically and effectively for a variety of Anthology products. We deliver products and services for analytics, data science, business intelligence, and reporting. The successful candidate will have a strong foundation in software development, scaled infrastructure, containerization, pipeline development, and configuration management, as well as strong problem-solving skills, analytical thinking skills, and strong written and oral communication skills.

Primary responsibilities will include:
• Learning quickly and developing creative solutions that encompass performance, reliability, maintainability, and security
• Applying hands-on implementation solutions using the AWS tool suite and other components to support Anthology products that utilize an expansive data lake
• Working with the development manager, product manager, and engineering team on projects related to system research, product design, product development, and defect resolution
• Being willing to respond to the unique challenges of delivering and maintaining cloud-based software.
This includes minimizing downtime, troubleshooting live production environments, and responding to client-reported issues
• Working with other engineering personnel to ensure consistency among products
• Through continued iteration on existing development processes, ensuring that we're leading by example, fixing things that aren't working, and always improving our expectations of ourselves and others
• Thriving in the face of difficult problems
• Working independently with general supervision

The Candidate

Required skills/qualifications:
• 2-4 years of experience designing and developing enterprise solutions, including serverless/functionless API services
• Knowledge of OOP
• Experience with Python and TypeScript/JavaScript
• Experience with SQL using Snowflake, Oracle, MSSQL, PostgreSQL, or another RDBMS
• Data-structure and algorithm analysis and design skills
• Knowledge of distributed systems and tradeoffs in consistency, availability, and network-failure tolerance
• Knowledge of professional engineering best practices for the full SDLC, including coding standards, code reviews, source control management, build processes, testing, and operations
• Knowledge of a broader set of tools in the AWS tool suite (CDK, CloudFront, CloudWatch, CodeCommit, CodeBuild, CodePipeline, Lambda, API Gateway, SNS, SQS, S3, KMS, Batch, DynamoDB, DMS), Docker
• Fluency in written and spoken English

Preferred skills/qualifications:
• Experience designing, developing, and operating scalable near-real-time data pipelines and stream processing
• Experience with designing and implementing ETL processes
• Experience with fact/dimensional modeling (Kimball, Inmon)
• Previous experience in the education industry and e-learning technologies
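As a rough illustration of the "serverless/functionless API services" requirement, here is a minimal Lambda-style handler sketch. The event shape is a simplified stand-in for an API Gateway proxy event, and the in-memory table, route, and item names are all hypothetical.

```python
# Minimal Lambda-style handler sketch (illustrative only): routes a
# GET /items/{id} lookup against an in-memory dict that stands in for a
# real datastore such as DynamoDB.
import json

FAKE_TABLE = {"42": {"id": "42", "name": "example-course"}}

def handler(event, context=None):
    item_id = (event.get("pathParameters") or {}).get("id")
    item = FAKE_TABLE.get(item_id)
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item)}

print(handler({"pathParameters": {"id": "42"}})["statusCode"])  # → 200
```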

Posted 1 week ago

Apply

8.0 - 12.0 years

20 - 27 Lacs

Hyderabad, Pune

Work from Office


Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data platforms. Contact: KASHIF@D2NSOLUTIONS.COM

Posted 1 week ago

Apply

6.0 - 11.0 years

0 - 2 Lacs

Kochi, Thiruvananthapuram

Work from Office


Role & responsibilities:
• Lead and mentor a team of engineers, providing guidance on design, coding, and troubleshooting.
• Architect highly scalable and highly available systems with minimal guidance.
• Translate business requirements into scalable and extensible designs.
• Work closely with team members to deliver complete solutions to customers.
• Communicate effectively with clients to gather requirements, provide updates, and ensure their needs are met.
• Participate in all phases of the agile development cycle, including planning, design, implementation, review, testing, deployment, documentation, and training.
• Troubleshoot and resolve customer issues promptly and effectively.
• Advocate for and practice Agile methodologies within the team.
• Strive for continuous improvement in development processes and team productivity.
• Implement and maintain APIs using REST, GraphQL, and gRPC.

Preferred candidate profile:
• 6+ years of web development experience, with at least 2+ years of proven proficiency in Go programming.
• Deep understanding of RDBMS and NoSQL databases.
• Experience implementing AWS services, including containerization to support Go applications and repository maintenance.
• Proficiency with Kubernetes for application development, deployment, and scaling.
• Strong knowledge of RESTful APIs and GraphQL, and an understanding of the differences between REST, GraphQL, and gRPC.
• Knowledge of additional web technologies such as Python or PHP is a plus.
• Strong time management and organizational skills.
• Knowledge of Go templating, common frameworks, and tools.
• Ability to instill and enforce best practices in web development within the team.
• Strong design skills, with an understanding of design and architectural patterns.
• Excellent understanding of data structures and algorithms.
• A sense of urgency and ownership over the product, with a proactive approach to work.
• A great attitude towards work and people, fostering a positive team environment.
• Excellent analytical and problem-solving skills.
• Ability to thrive in a fast-paced service environment.
• Experience in technical mentorship or coaching is highly desirable.
• Excellent written and verbal communication skills.
• Experience working in an Agile development environment is a plus.
• Knowledge of front-end development technologies such as HTML, CSS, and JavaScript.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 11 Lacs

Pune

Work from Office


Greetings from Peoplefy Infosolutions! We are hiring for one of our reputed MNC clients based in Pune.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field, or equivalent work experience
• Fluent English (B2-C1)

Job Description

Missions:
• Perimeter: all business applications worldwide in production
• Resolve incidents in a short time to give the business the best quality
• Work transversely with other service lines and business entities to meet the key performance indicators
• Continuous improvement by participating in problem management

The main activities are:
• Contribute to incident resolution, service-request completion, and change implementation
• Participate in patching activities

Shift Structure:
• The support team operates 24/7, with shifts organized to ensure continuous coverage.
• Shifts are typically 8 hours long, with rotations to cover day, evening, and night shifts.
• Flexibility to work weekends and holidays as part of the shift rotation is required.

Technical Skills:
• Operating systems: proficiency in Windows and Linux
• Hardware knowledge: familiarity with servers, storage devices, and other hardware components
• Troubleshooting: ability to diagnose and resolve software and operating-system issues
• Scripting and automation: skills in scripting languages like Python, PowerShell, or Bash to automate tasks
• Virtualization: basic knowledge of VMware, Hyper-V, or other virtualization technologies
• Database management: basic knowledge of SQL and database management systems

Interpersonal Skills:
• Customer service: strong communication skills to explain technical issues to non-technical users
• Problem-solving: analytical skills to diagnose and resolve issues efficiently
• Time management: ability to prioritize tasks and manage time effectively
• Team collaboration: working well with other team members and departments
• Adaptability: willingness to learn new technologies and adapt to changing environments

Additional Skills:
• Technical documentation: writing clear and concise documentation for troubleshooting and procedures
• Security awareness: understanding of basic cybersecurity principles to protect systems and data

Benefits:
• Career development and training opportunities
• A friendly, collaborative work environment with opportunities to make an impact

Key Expected Achievements:
• Provide 1st- and 2nd-level support for applications and middleware to ensure smooth business operations
• Respond promptly to service requests and incidents, providing detailed solutions or escalations as needed
• Change management: participation in the change-management process
• Problem management: participation in the analysis of root causes of incidents
• Monitor application performance and perform necessary maintenance and upgrades
• Create, maintain, and manage knowledge-base articles and documentation for internal and end-user support
• Collaborate with cross-functional teams to improve application functionality and efficiency
• Perform system diagnostics, software configuration, and basic database queries to resolve issues
• Monitor and report on application metrics, including uptime, performance, and user satisfaction
Interested candidates for the above position, kindly share your CV at pranita.th@peoplefy.com with the below details:
• Experience:
• CTC:
• Expected CTC:
• Notice Period:
• Location:

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Role: Technology Support
Job Title: Senior DataStage Consultant
Career Level: 08
Must have skills: Apache, Tomcat, IIS, IBM WebSphere administration, IBM DataStage, Linux shell scripting
Good to have skills: Teradata; RDBMS and SQL experience in Oracle, DB2

About The Role
Lead the effort to design, build, and configure applications, acting as the primary point of contact.

Job Summary
The ETL Developer is responsible for L2-L3 production support, administration of the DataStage application, and designing, building, deploying, and maintaining ETL DataStage interfaces using the IBM InfoSphere DataStage ETL development tool.

Key Responsibilities
1. Extend production support in an L2/L3 capacity; efficiently debug and troubleshoot production issues
2. Evaluate existing data solutions, write scalable ETLs, develop documentation, and train/help team members
3. Collaborate and work with business/development teams and infrastructure teams on L3 issues and follow tasks to completion
4. Participate in and provide support for releases, risks, mitigation plans, and regular DR exercises for project rollout
5. Drive automation and permanent fixes to prevent issues from reoccurring
6. Manage service-level agreements
7. Bring continuous improvements to reduce time-to-resolve for production incidents
8. Perform root-cause analysis and identify and implement corrective and preventive measures
9. Document standards, processes, and procedures relating to best practices, issues, and resolutions
10. Constantly upskill with tools and technologies to meet the organization's future needs
11. Be available on call (on rotation) in a support role
12. Effectively manage multiple, competing priorities

Technical Responsibilities
• Excellent understanding of technical concepts
• Strong understanding of OS-related dependencies
• Strong exposure to shell scripting
• Expertise in any cloud and middleware technologies would be a great value-add

Professional Attributes
• Good verbal and written communication skills to connect with customers at varying levels of the organization
• Ability to operate independently and make decisions with little direct supervision
• Candidates must be willing to cross-skill and upskill based on project and business requirements

Education Qualification
• A higher-level qualification in a technical subject is desirable
• IBM DataStage certification

Additional Information
A: Strong written and oral communication skills
B: Should be open to working in shifts

Qualification: 15 years full-time education

Posted 1 week ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

Mumbai

Work from Office


Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Spring Boot
Good to have skills : Spring Application Framework
Minimum 5 year(s) of experience is required
Educational Qualification : 15 yrs of mandatory education
Job Requirements:
Key Responsibilities:
a: Responsible for development and testing
b: Should have in-depth knowledge of complex systems-integration environments
c: Should be able to implement new design patterns
d: Should be able to coordinate with multiple teams at Accenture and the client
Technical Experience:
a: Minimum 5 years of experience in designing, developing, coding and testing in Java technologies, with good communication skills
b: Experience in Spring and Hibernate
c: Well versed with OOPS concepts
d: Knowledge of SQL and RDBMS
Professional Attributes:
A: Resources should have good communication skills
B: Resources should have good analytical skills
Additional Info: Location - Mumbai. Resources are required to travel to the office 2 days a week.
Qualification: 15 yrs of mandatory education

Posted 1 week ago

Apply

6.0 - 10.0 years

35 - 37 Lacs

Bengaluru

Work from Office


Job Overview: We are seeking a highly skilled Senior Platform Engineer with a robust background in Python programming and extensive experience with AWS services. With at least 6 years of relevant experience, the ideal candidate will be an expert in serverless development and event-driven architecture design. This position suits a proactive, passionate engineer eager to take ownership of modules within our cloud management platform, contributing significantly to its scalability, efficiency, and innovation. You'll have the opportunity to work on cutting-edge technology, shaping the future of our cloud management platform with your expertise. If you're passionate about building scalable, efficient, and innovative cloud solutions, we'd love to have you on our team.
Responsibilities:
- Take full ownership of developing, maintaining, and enhancing specific modules of our cloud management platform, ensuring they meet our standards for scalability, efficiency, and reliability.
- Design and implement serverless applications and event-driven systems that integrate seamlessly with AWS services, driving the platform's innovation forward.
- Work closely with cross-functional teams to conceptualize, design, and implement advanced features and functionalities that align with our business goals.
- Use your deep expertise in cloud architecture and software development to provide technical guidance and best practices to the engineering team, enhancing the platform's capabilities.
- Stay ahead of the curve by researching and applying the latest trends and technologies in the cloud industry, incorporating these insights into the development of our platform.
- Solve complex technical issues, providing advanced support and guidance to both internal teams and external stakeholders.
Requirements:
- A minimum of 6 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- Proven expertise in serverless development and event-driven architecture design, with a track record of developing and shipping high-quality SaaS platforms on AWS.
- Comprehensive understanding of cloud computing concepts, architectural best practices, and AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Solid knowledge of Object-Oriented Programming (OOP), SOLID principles, and experience with relational and NoSQL databases.
- Proficiency in developing and integrating RESTful APIs and familiarity with source control systems like Git.
- Exceptional problem-solving skills, capable of optimizing complex systems.
- Excellent communication skills, capable of collaborating effectively with team members and engaging with stakeholders.
- A strong drive for continuous learning and staying updated with industry developments.
Nice to Have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
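The event-driven serverless pattern this role centres on can be sketched briefly. The Python below is a hypothetical illustration only (the event shape, key convention, and names are assumptions, not anything from the posting): the transformation is kept as a pure function, with the actual boto3/DynamoDB call left as a stub so the logic stays testable without AWS credentials.

```python
import json

def build_item(event):
    """Turn an API Gateway-style event into a DynamoDB-shaped item.

    Pure function: separating the transformation from the boto3 call
    makes it unit-testable without any AWS access.
    """
    body = json.loads(event.get("body") or "{}")
    return {
        "pk": f"ORDER#{body['order_id']}",   # partition-key convention (assumed)
        "status": body.get("status", "NEW"),
        "source": event.get("source", "api"),
    }

def handler(event, context=None):
    """Lambda-style entry point. In production this would call
    boto3.resource('dynamodb').Table(...).put_item(Item=item)."""
    item = build_item(event)
    return {"statusCode": 200, "body": json.dumps(item)}

if __name__ == "__main__":
    event = {"body": json.dumps({"order_id": "42", "status": "PAID"})}
    print(handler(event))
```

Keeping the handler thin and the logic pure is a common way to make event-driven code easy to test and reason about.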

Posted 1 week ago

Apply

2.0 - 4.0 years

10 - 12 Lacs

Bengaluru

Work from Office


Responsibilities:
- Design, develop, and maintain various components of our cloud management platform, ensuring high performance and responsiveness.
- Collaborate with cross-functional teams to conceptualize, design, and deploy innovative features and functionalities that meet our business needs.
- Offer technical support and guidance to internal teams and stakeholders, helping to resolve complex issues.
- Keep abreast of the latest trends and technologies in the industry to incorporate best practices into our platform.
Requirements:
- 1.5 to 2 years of professional experience in developing applications or platforms using Python.
- Strong understanding of Object-Oriented Programming (OOP), SOLID principles, and Relational Database Management Systems (RDBMS).
- Proven experience with AWS services, such as Lambda, RDS, and DynamoDB, with a strong grasp of cloud computing concepts and architectural best practices.
- Experience in developing and integrating RESTful APIs.
- Experience with source control systems, such as Git.
- Exceptional problem-solving abilities, with a knack for debugging complex systems.
- Excellent communication skills, capable of collaborating effectively with team members and engaging with stakeholders.
- A relentless drive for learning and staying current with industry developments.
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
Nice to Have:
- AWS Certified Developer Associate or other relevant AWS certifications.
- Experience in serverless development is a significant plus, showcasing familiarity with building and deploying serverless applications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms like Azure or GCP.
- Familiarity with containerization and orchestration technologies, such as Docker and Kubernetes.
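As a rough illustration of the OOP/SOLID expectations above, here is a minimal dependency-inversion sketch in Python, using the standard-library sqlite3 module as a stand-in RDBMS. All class, table, and function names are invented for the example:

```python
import sqlite3
from abc import ABC, abstractmethod

class UserRepository(ABC):
    """Abstraction the application codes against (dependency inversion)."""
    @abstractmethod
    def add(self, name: str) -> int: ...
    @abstractmethod
    def get(self, user_id: int) -> str: ...

class SqliteUserRepository(UserRepository):
    """One concrete backend; an RDS- or DynamoDB-backed one could swap in."""
    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")

    def add(self, name):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        return cur.lastrowid

    def get(self, user_id):
        row = self.conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        return row[0] if row else None

def register_user(repo: UserRepository, name: str) -> int:
    """Business logic depends only on the abstraction, not on sqlite3."""
    return repo.add(name.strip().title())

if __name__ == "__main__":
    repo = SqliteUserRepository(sqlite3.connect(":memory:"))
    uid = register_user(repo, "  ada lovelace ")
    print(repo.get(uid))
```

Because `register_user` sees only the `UserRepository` interface, the storage engine can change without touching business logic, which is the point of the dependency-inversion principle the listing alludes to.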

Posted 1 week ago

Apply

5.0 - 9.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Senior Backend Engineer (Java)
Responsibilities:
- Designing and implementing software using Java
- Ensuring code quality by implementing unit, integration and end-to-end tests
- Optimising applications for maximum performance
- Working on DevOps-related activities (CI/CD, infrastructure, etc.)
- Working in a distributed team and cooperating with other teams on cross-team deliveries
- Troubleshooting, analysing, and solving integration and production issues
Skills:
- 6+ years of professional Java software development experience
- Strong knowledge of Java; strong system design and programming skills
- Experience with Spring Framework, Spring Boot, REST, CI and Kanban
- Familiarity with common algorithms, data structures and multithreading
- Familiarity with Git/Gradle, Docker, Kubernetes, Continuous Delivery and DevOps
- Experience with RDBMS (MySQL, etc.) and NoSQL (Apache Cassandra, etc.) databases
- Comfortable making technical and architectural decisions autonomously
- Communicative, able to explain concepts well to both technical and non-technical audiences
- Experience with Java 11 or above is a must

Posted 1 week ago

Apply

5.0 - 10.0 years

35 - 37 Lacs

Bengaluru

Work from Office


Job Overview: We are seeking a highly skilled Senior Platform Engineer with a robust background in Python programming and extensive experience with AWS services. With at least 6 years of relevant experience, the ideal candidate will be an expert in serverless development and event-driven architecture design. This position suits a proactive, passionate engineer eager to take ownership of modules within our cloud management platform, contributing significantly to its scalability, efficiency, and innovation. You'll have the opportunity to work on cutting-edge technology, shaping the future of our cloud management platform with your expertise. If you're passionate about building scalable, efficient, and innovative cloud solutions, we'd love to have you on our team.
Responsibilities:
- Take full ownership of developing, maintaining, and enhancing specific modules of our cloud management platform, ensuring they meet our standards for scalability, efficiency, and reliability.
- Design and implement serverless applications and event-driven systems that integrate seamlessly with AWS services, driving the platform's innovation forward.
- Work closely with cross-functional teams to conceptualize, design, and implement advanced features and functionalities that align with our business goals.
- Use your deep expertise in cloud architecture and software development to provide technical guidance and best practices to the engineering team, enhancing the platform's capabilities.
- Stay ahead of the curve by researching and applying the latest trends and technologies in the cloud industry, incorporating these insights into the development of our platform.
- Solve complex technical issues, providing advanced support and guidance to both internal teams and external stakeholders.
Requirements:
- A minimum of 6 years of relevant experience in platform or application development, with a strong emphasis on Python and AWS cloud services.
- Proven expertise in serverless development and event-driven architecture design, with a track record of developing and shipping high-quality SaaS platforms on AWS.
- Comprehensive understanding of cloud computing concepts, architectural best practices, and AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Solid knowledge of Object-Oriented Programming (OOP), SOLID principles, and experience with relational and NoSQL databases.
- Proficiency in developing and integrating RESTful APIs and familiarity with source control systems like Git.
- Exceptional problem-solving skills, capable of optimizing complex systems.
- Excellent communication skills, capable of collaborating effectively with team members and engaging with stakeholders.
- A strong drive for continuous learning and staying updated with industry developments.
Nice to Have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.

Posted 1 week ago

Apply

3.0 - 8.0 years

12 - 16 Lacs

Mumbai

Work from Office


We are seeking an experienced Database Administrator (DBA) to manage and maintain our MongoDB and PostgreSQL databases. The ideal candidate will have expertise in both NoSQL and relational databases, with a strong focus on performance, security, and data integrity.
Key Responsibilities
MongoDB Responsibilities:
1. Database Design and Architecture: Design and implement MongoDB databases, including data modeling, schema design, and indexing.
2. Performance Tuning: Optimize MongoDB performance, including query optimization, indexing, and caching.
3. Data Migration and Integration: Develop and implement data migration and integration strategies for MongoDB.
4. Security and Backup: Ensure MongoDB security and backup, including data encryption, access control, and disaster recovery.
PostgreSQL Responsibilities:
1. Database Design and Architecture: Design and implement PostgreSQL databases, including data modeling, schema design, and indexing.
2. Performance Tuning: Optimize PostgreSQL performance, including query optimization, indexing, and caching.
3. Data Migration and Integration: Develop and implement data migration and integration strategies for PostgreSQL.
4. Security and Backup: Ensure PostgreSQL security and backup, including data encryption, access control, and disaster recovery.
Shared Responsibilities:
1. Database Monitoring and Troubleshooting: Monitor database performance, troubleshoot issues, and resolve problems.
2. Collaboration and Communication: Collaborate with development teams, provide database support, and communicate technical information to non-technical stakeholders.
3. Documentation and Knowledge Sharing: Document database designs, configurations, and procedures, and share knowledge with other teams.
Requirements:
1. Experience: 5+ years of experience as a DBA, with expertise in both MongoDB and PostgreSQL.
2. Technical Skills: Strong understanding of database design, architecture, and performance tuning for both MongoDB and PostgreSQL.
3. Operating Systems: Experience with Linux and/or Windows operating systems.
4. Scripting Languages: Familiarity with scripting languages, such as Python, Bash, or Perl.
5. Certifications: MongoDB and/or PostgreSQL certifications are a plus.
Nice to Have:
1. Cloud Experience: Experience with cloud-based databases, such as MongoDB Atlas or Amazon RDS.
2. DevOps Tools: Familiarity with DevOps tools, such as Docker, Kubernetes, or Ansible.
3. Agile Methodologies: Experience with Agile development methodologies.
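The query-optimization and indexing duties above apply across engines. As a purely illustrative sketch of the principle, using Python's bundled sqlite3 rather than MongoDB or PostgreSQL themselves, the snippet below shows how adding an index changes a query plan from a full scan to an index lookup (table and index names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, email TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts (email, balance) VALUES (?, ?)",
                 [(f"user{i}@example.com", i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the engine scans the table
    # or uses an index for the given statement
    return " ".join(r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT balance FROM accounts WHERE email = 'user500@example.com'"
before = plan(query)   # no index yet: the plan is a table scan
conn.execute("CREATE INDEX idx_accounts_email ON accounts(email)")
after = plan(query)    # now the plan uses idx_accounts_email

print(before)
print(after)
```

PostgreSQL's `EXPLAIN` and MongoDB's `explain()` serve the same diagnostic role; the habit of checking the plan before and after an index change carries over directly.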

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office


PTC is seeking a highly experienced Site Reliability Engineer (SRE) with a strong background in DevOps architecture, secure software integration, and SaaS platform delivery. This role is ideal for someone passionate about embedding security early in the software supply chain, from build to deployment, while enhancing agility and reliability. As part of a global, cross-functional team, you will contribute to the design and implementation of secure, high-performance, and scalable delivery pipelines for PTC's flagship ALM suite, Codebeamer, hosted on a SaaS platform.
Key Responsibilities:
- Collaborate with global teams to implement containerized DevOps infrastructure and cloud-native solutions.
- Build and maintain automation for deployment, monitoring, reporting, and analysis.
- Manage CI/CD pipelines to optimize efficiency and reliability.
- Apply industry best practices for system hardening and configuration management.
- Secure, scale, and manage Linux-based virtual environments.
- Develop and maintain solutions for system administration, backups, disaster recovery, and performance/security monitoring.
- Design and implement secure automation for development, testing, and production environments.
- Continuously evaluate and improve existing systems, ensuring compliance with industry standards and best practices.
- Promote knowledge sharing across cross-functional teams, including IT and Engineering.
- Communicate clearly about technical decisions, trade-offs, and their impact on the broader system.
Required Skills & Experience:
- Minimum 7 years of hands-on experience in DevOps and SRE roles.
- Proven expertise in continuous delivery and deployment of SaaS products.
- Deep understanding of cloud-native technologies, especially on Azure and AWS, including IaaS, PaaS, backup, rollback, and HA/DR strategies.
- Advanced knowledge of containers and Kubernetes (both managed and OS-level), including ingress controllers such as Emissary and NGINX.
- Proficiency in scripting languages such as Python, Shell, and Groovy; familiarity with YAML.
- Experience with PostgreSQL or other RDBMS.
- Strong version control skills using GitHub or similar source-control tools.
- Hands-on experience with Jenkins for CI/CD pipeline management.
- Familiarity with configuration and deployment tools such as Ansible, Helm, Helmfile/Helmsman, Terraform, and FluxCD.
- Experience with monitoring and logging tools such as Prometheus, Grafana, and Azure Monitoring.
Preferred Qualifications:
- CKA (Certified Kubernetes Administrator) certification.
- Solid understanding of Linux system administration.
- Familiarity with Agile methodologies, frameworks, and metrics.
- Experience working with large, interoperable product suites.
What We Value:
- A proactive, solutions-oriented mindset with a strong focus on automation and security.
- A collaborative approach to problem-solving and a commitment to continuous learning.
- Strong analytical and decision-making skills with the ability to drive initiatives to completion.
- Excellent communication and interpersonal skills, with the ability to work effectively across diverse, multi-site teams.
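As a small illustration of the monitoring and reporting automation this role calls for, here is a hedged Python sketch of an SLO error-budget calculation. The SLO target and probe counts are invented for the example; real SRE tooling would typically pull these numbers from Prometheus or a similar monitoring backend:

```python
def error_budget(slo: float, total: int, failed: int) -> dict:
    """Compute availability against an SLO and the remaining error budget.

    slo    -- target availability, e.g. 0.999 for "three nines"
    total  -- probes attempted in the window
    failed -- probes that failed
    """
    availability = (total - failed) / total
    allowed_failures = total * (1 - slo)   # failures the SLO tolerates
    return {
        "availability": round(availability, 5),
        "budget_remaining": round(1 - failed / allowed_failures, 3) if allowed_failures else 0.0,
        "slo_met": availability >= slo,
    }

if __name__ == "__main__":
    # 10,000 probes with 7 failures, against a 99.9% SLO (10 failures allowed)
    print(error_budget(0.999, 10_000, 7))
```

Tracking how much budget remains, rather than just pass/fail, is what lets an SRE team trade reliability work against feature velocity.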

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Our Data Engineers play a crucial role in designing and operationalizing transformational enterprise data solutions on cloud platforms, integrating Azure services, Snowflake technology, and other third-party data technologies. Cloud Data Engineers will work closely with a multidisciplinary agile team to build high-quality data pipelines that drive analytic solutions. These solutions will generate insights from our connected data, enabling Kimberly-Clark to advance its data-driven decision-making capabilities. The ideal candidate will have a deep understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows. They should be skilled in creating data products that support analytic solutions and proficient in working with APIs and the data structures that serve them. Experience in using Azure Data Factory (ADF) for orchestrating and automating data movement and transformation is required, as is expertise in data visualization tools, specifically Power BI. The candidate should have strong problem-solving skills, be able to work as part of a technical, cross-functional analytics team, and be an agile learner with a passion for solving complex data problems and delivering insights. This role is perfect for a developer passionate about leveraging cutting-edge technologies to create impactful digital products that connect with and serve our clients effectively. Kimberly-Clark has an amazing opportunity to continue leading the market, and DTS is poised to deliver compelling and robust digital capabilities, products, and solutions to support it. This role will have substantial influence in this endeavor.
Scope/Categories: The role will report to the Data Analytics Engineer Manager and Product Owner.
Key Responsibilities:
- Design and operationalize enterprise data solutions on cloud platforms: Develop and implement scalable and secure data solutions on cloud platforms, ensuring they meet enterprise standards and requirements. This includes designing data architecture, selecting appropriate cloud services, and optimizing performance for data processing and storage.
- Integrate Azure services, Snowflake technology, and other third-party data technologies: Seamlessly integrate various data technologies to create a cohesive data ecosystem. This involves configuring data connectors, ensuring data-flow consistency, and managing dependencies between different systems.
- Build and maintain high-quality data pipelines for analytic solutions: Develop robust data pipelines that automate the extraction, transformation, and loading (ETL) of data from various sources into a centralized data warehouse or lake. Ensure these pipelines are efficient, reliable, and capable of handling large volumes of data.
- Collaborate with a multidisciplinary agile team to generate insights from connected data: Work closely with data scientists, analysts, and other team members in an agile environment to translate business requirements into technical solutions. Participate in sprint planning, stand-ups, and retrospectives to ensure timely delivery of data products.
- Manage and create data inventories for analytics and APIs to be consumed: Develop and maintain comprehensive data inventories that catalog available data assets and their metadata. Ensure these inventories are accessible and usable by various stakeholders, including through APIs that facilitate data consumption.
- Design data integrations with internal and external products: Architect and implement data integration solutions that enable seamless data exchange between internal systems and external partners or products. This includes ensuring data integrity, security, and compliance with relevant standards.
- Build data visualizations to support analytic insights: Create intuitive and insightful data visualizations using tools like Power BI, incorporating semantic layers to provide a unified view of data and help stakeholders understand complex data sets and derive actionable insights.
Required Skills and Experience:
- Proficiency with the Snowflake ecosystem: Demonstrated ability to use Snowflake for data warehousing, including data ingestion, transformation, and querying. Proficiency in using Snowflake's features for scalable data processing, including Snowpipe for continuous data ingestion and Snowflake's SQL capabilities for data transformation. Ability to optimize Snowflake performance through clustering, partitioning, and other best practices.
- Azure Data Factory (ADF): Experience in using ADF for orchestrating and automating data movement and transformation within the Azure ecosystem.
- Proficiency in programming languages such as SQL, Python, Java, R, and Scala, as well as NoSQL query languages: Strong coding skills in multiple languages used for data manipulation, analysis, and pipeline development.
- Experience with ETL (extract, transform, and load) systems and API integrations: Expertise in building and maintaining ETL processes to consolidate data from various sources into centralized repositories, and in integrating APIs for seamless data exchange.
- Understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows: Comprehensive knowledge of designing and implementing data systems that support various analytic and operational use cases, including data storage, processing, and retrieval.
- Basic understanding of machine learning concepts: Familiarity with key machine learning principles and techniques to collaborate better with data scientists and support their analytical models.
- Strong problem-solving skills and ability to work as part of a technical, cross-functional analytics team: Excellent analytical and troubleshooting abilities, with the capability to collaborate effectively with team members from various technical and business domains.
- Skilled in creating data products that support analytic solutions: Proficiency in developing data products that enable stakeholders to derive meaningful insights and make data-driven decisions, including datasets, data models, and data services tailored to specific business needs.
- Experience in working with APIs and the data structures that serve them: Experience in designing, developing, and consuming APIs for data access and integration, including the various data structures and formats used in API communication.
- Knowledge of managing sensitive data: Expertise in handling sensitive data with strict adherence to data privacy regulations and security best practices to protect against unauthorized access and breaches.
- Agile learner with a passion for solving complex data problems and delivering insights through innovative solutions.
- Experience with CPG companies and POS data: Experience in analyzing and interpreting POS data to provide actionable insights for CPG companies, enhancing their understanding of consumer behavior and optimizing sales strategies.
Knowledge and Experience:
- Bachelor's degree in management information systems/technology, Computer Science, Engineering, or a related discipline; an MBA or equivalent is preferred.
- 7+ years of experience in designing large-scale data solutions, performing design assessments, crafting design options and analysis, and finalizing the preferred solution with IT and business stakeholders.
- 5+ years of experience tailoring, configuring, and crafting solutions within the Snowflake environment, including a profound grasp of Snowflake's data warehousing capabilities, data architecture, SQL optimization for Snowflake, and leveraging Snowflake's unique features such as Snowpipe, Streams, and Tasks for real-time data processing and analytics. A strong foundation in data migration strategies, performance tuning, and securing data within the Snowflake ecosystem is essential.
- 3+ years of demonstrated expertise in architecting solutions within the Snowflake ecosystem, adhering to best practices in data architecture and design patterns.
- 7+ years of data engineering or design experience: designing, developing, and deploying scalable enterprise data analytics solutions from source system through ingestion and reporting.
- Expertise in data modeling principles and methods, including conceptual, logical and physical data models for data warehouses, data lakes and/or database management systems.
- 5+ years of hands-on experience designing, building, and operationalizing data solutions and applications using cloud data and analytics services in combination with third parties.
- 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols).
- 7+ years of experience with database development and scripting.
Professional Skills:
- Strong communication and interpersonal skills.
- Strong analytical and problem-solving skills and a passion for product development.
- Strong understanding of Agile methodologies and openness to working in agile environments with multiple stakeholders.
- Professional attitude and service orientation; team player.
- Ability to translate business needs into potential analytics solutions.
- Strong work ethic; ability to work at an abstract level and gain consensus.
- Ability to build trust and rapport to create a comfortable and effective workplace.
- Self-starter who can see the big picture and prioritize work to make the largest impact on the business and customers' vision and requirements.
- Fluency in English.
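The extract-transform-load pipeline work described above can be sketched end to end in a few lines. The example below is illustrative only: it uses Python with the standard-library sqlite3 as a stand-in for a warehouse such as Snowflake, and the schema and sample rows are invented:

```python
import sqlite3

# Extract: source rows as they might arrive from an API or staged files
raw_sales = [
    {"store": "BLR-01", "amount": "1250.50", "date": "2024-05-01"},
    {"store": "BLR-01", "amount": "980.00",  "date": "2024-05-01"},
    {"store": "PUN-02", "amount": "410.25",  "date": "2024-05-01"},
]

def transform(rows):
    """Cast types and aggregate to one row per store per day."""
    totals = {}
    for r in rows:
        key = (r["store"], r["date"])
        totals[key] = totals.get(key, 0.0) + float(r["amount"])
    return [{"store": s, "date": d, "total": t} for (s, d), t in sorted(totals.items())]

def load(conn, rows):
    """Load the transformed rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS daily_sales (store TEXT, date TEXT, total REAL)")
    conn.executemany("INSERT INTO daily_sales VALUES (:store, :date, :total)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(raw_sales))
print(conn.execute("SELECT store, total FROM daily_sales ORDER BY store").fetchall())
```

In a production pipeline the same extract/transform/load split would be orchestrated by a tool like ADF, with Snowpipe or a COPY step playing the load role; the separation of stages is what makes each one independently testable.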

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Expertise in JavaScript, Node.js, React.js, AngularJS and Jest
- 5+ years of experience in front-end and back-end web development using JS frameworks (React.js, Node.js, Angular)
- 2 to 3 years of experience with a JS unit-testing library (Jest, Mocha)
- 2 to 3 years of experience with database technology, preferably RDBMS/ORDBMS
- 2+ years of experience with GitHub and advanced GitHub features
- Experience working with large microservice-based architectures

Posted 1 week ago

Apply

7.0 - 12.0 years

14 - 19 Lacs

Bengaluru

Work from Office


you're driven to perform at the highest level possible, and you appreciate a performance culture fueled by authentic caring. You want to be part of a company actively dedicated to sustainability, inclusion, we'llbeing, and career development. You love what you do, especially when the work you do makes a difference. At Kimberly-Clark, we're constantly exploring new ideas on how, when, and where we can best achieve results. When you join our team, you'll experience Flex That Works: flexible (hybrid) work arrangements that empower you to have purposeful time in the office and partner with your leader to make flexibility work for both you and the business. Our Data Engineers play a crucial role in designing and operationalizing transformational enterprise data solutions on Cloud Platforms, integrating Azure services, Snowflake technology, and other third-party data technologies. Cloud Data Engineers will work closely with a multidisciplinary agile team to build high-quality data pipelines that drive analytic solutions. These solutions will generate insights from our connected data, enabling Kimberly-Clark to advance its data-driven decision-making capabilities. The ideal candidate will have a deep understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows. They should be skilled in creating data products that support analytic solutions and possess proficiency in working with APIs and understanding data structures to serve them. Experience in using ADF (Azure Data Factory) for orchestrating and automating data movement and transformation . Additionally, expertise in data visualization tools, specifically PowerBI, is required. The candidate should have strong problem-solving skills, be able to work as part of a technical, cross-functional analytics team, and be an agile learner with a passion for solving complex data problems and delivering insights. 
If you are an agile learner, possess strong problem-solving skills, can work as part of a technical, cross-functional analytics team, and want to solve complex data problems while delivering insights that help enable our analytics strategy, we would like to hear from you. This role is perfect for a developer passionate about leveraging cutting-edge technologies to create impactful digital products that connect with and serve our clients effectively. Kimberly-Clark has an amazing opportunity to continue leading the market, and DTS is poised to deliver compelling and robust digital capabilities, products, and solutions to support it. This role will have substantial influence in this endeavor. If you are excited to make a difference applying cutting-edge technologies to solve real business challenges and add value to a global, market-leading organization, please come join us! Scope/Categories: Role will report to the Data Analytics Engineer Manager and Product Owner. Key Responsibilities: Design and operationalize enterprise data solutions on Cloud Platforms : Develop and implement scalable and secure data solutions on cloud platforms, ensuring they meet enterprise standards and requirements. This includes designing data architecture, selecting appropriate cloud services, and optimizing performance for data processing and storage. Integrate Azure services, Snowflake technology, and other third-party data technologies: Seamlessly integrate various data technologies, including Azure services, Snowflake, and other third-party tools, to create a cohesive data ecosystem. This involves configuring data connectors, ensuring data flow consistency, and managing dependencies between different systems. Build and maintain high-quality data pipelines for analytic solutions: Develop robust data pipelines that automate the extraction, transformation, and loading (ETL) of data from various sources into a centralized data warehouse or lake. 
Ensure these pipelines are efficient, reliable, and capable of handling large volumes of data. Collaborate with a multidisciplinary agile team to generate insights from connected data: Work closely with data scientists, analysts, and other team members in an agile environment to translate business requirements into technical solutions. Participate in sprint planning, stand-ups, and retrospectives to ensure timely delivery of data products. Manage and create data inventories for analytics and APIs to be consumed: Develop and maintain comprehensive data inventories that catalog available data assets and their metadata. Ensure these inventories are accessible and usable by various stakeholders, including through APIs that facilitate data consumption. Design data integrations with internal and external products: Architect and implement data integration solutions that enable seamless data exchange between internal systems and external partners or products. This includes ensuring data integrity, security, and compliance with relevant standards. Build data visualizations to support analytic insights: Create intuitive and insightful data visualizations using tools like PowerBI, incorporating semantic layers to provide a unified view of data and help stakeholders understand complex data sets and derive actionable insights. Required Skills and Experience: Proficiency with Snowflake Ecosystem: Demonstrated ability to use Snowflake for data warehousing, including data ingestion, transformation, and querying. Proficiency in using Snowflake's features for scalable data processing, including the use of Snowpipe for continuous data ingestion and Snowflake's SQL capabilities for data transformation. Ability to optimize Snowflake performance through clustering, partitioning, and other best practices. Azure Data Factory (ADF): Experience in using ADF for orchestrating and automating data movement and transformation within the Azure ecosystem.
Proficiency in programming languages such as SQL, NoSQL, Python, Java, R, and Scala: Strong coding skills in multiple programming languages used for data manipulation, analysis, and pipeline development. Experience with ETL (extract, transform, and load) systems and API integrations: Expertise in building and maintaining ETL processes to consolidate data from various sources into centralized repositories, and integrating APIs for seamless data exchange. Understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows: You should have a comprehensive knowledge of designing and implementing data systems that support various analytic and operational use cases, including data storage, processing, and retrieval. Basic understanding of machine learning concepts to support data scientists on the team: Familiarity with key machine learning principles and techniques to better collaborate with data scientists and support their analytical models. Strong problem-solving skills and ability to work as part of a technical, cross-functional analytics team: Excellent analytical and troubleshooting abilities, with the capability to collaborate effectively with team members from various technical and business domains. Skilled in creating data products that support analytic solutions: Proficiency in developing data products that enable stakeholders to derive meaningful insights and make data-driven decisions. This involves creating datasets, data models, and data services tailored to specific business needs. Experience in working with APIs and understanding data structures to serve them: Experience in designing, developing, and consuming APIs for data access and integration. This includes understanding various data structures and formats used in API communication.
Knowledge of managing sensitive data, ensuring data privacy and security: Expertise in handling sensitive data with strict adherence to data privacy regulations and security best practices to protect against unauthorized access and breaches. Agile learner with a passion for solving complex data problems and delivering insights: A proactive and continuous learner with enthusiasm for addressing challenging data issues and providing valuable insights through innovative solutions. Experience with CPG Companies and POS Data: Experience in analyzing and interpreting POS data to provide actionable insights for CPG companies, enhancing their understanding of consumer behavior and optimizing sales strategies. Knowledge and Experience: Bachelor's degree in management information systems/technology, Computer Science, Engineering, or related discipline. MBA or equivalent is preferred. 7+ years of experience in designing large-scale data solutions, performing design assessments, crafting design options and analysis, finalizing preferred solution choice working with IT and Business stakeholders. 5+ years of experience tailoring, configuring, and crafting solutions within the Snowflake environment, including a profound grasp of Snowflake's data warehousing capabilities, data architecture, SQL optimization for Snowflake, and leveraging Snowflake's unique features such as Snowpipe, Streams, and Tasks for real-time data processing and analytics. A strong foundation in data migration strategies, performance tuning, and securing data within the Snowflake ecosystem is essential. 3+ years demonstrated expertise in architecting solutions within the Snowflake ecosystem, adhering to best practices in data architecture and design patterns. 7+ years of data engineering or design experience, designing, developing, and deploying scalable enterprise data analytics solutions from source system through ingestion and reporting.
Expertise in data modeling principles/methods including Conceptual, Logical, and Physical Data Models for data warehouses, data lakes and/or database management systems. 5+ years of hands-on experience designing, building, and operationalizing data solutions and applications using cloud data and analytics services in combination with 3rd parties. 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols). 7+ years of experience with database development and scripting. Professional Skills: Strong communication and interpersonal skills. Strong analytical and problem-solving skills and passion for product development. Strong understanding of Agile methodologies and open to working in agile environments with multiple stakeholders. Professional attitude and service orientation; team player. Ability to translate business needs into potential analytics solutions. Strong work ethic, ability to work at an abstract level and gain consensus. Ability to build a sense of trust and rapport to create a comfortable and effective workplace. Self-starter who can see the big picture, prioritize work to make the largest impact on the business and customers' vision and requirements. Fluency in English.
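The extract/transform/load pipeline work this listing describes can be sketched minimally in Python. This is an illustrative sketch only: it uses SQLite as a stand-in for a cloud warehouse such as Snowflake, and the `orders` table, CSV columns, and sample data are hypothetical.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a source extract."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop rows missing the key."""
    out = []
    for r in rows:
        if not r.get("order_id"):
            continue  # data-quality rule: skip rows without a primary key
        out.append((int(r["order_id"]), r["region"].strip().upper(), float(r["amount"])))
    return out

def load(conn, rows):
    """Load: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

# Hypothetical source extract with one malformed row (missing order_id).
raw = "order_id,region,amount\n1,south ,10.5\n,east,3.0\n2,west,7.25\n"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In a real Snowflake pipeline the load step would typically hand off to a bulk mechanism such as Snowpipe or `COPY INTO` rather than row-wise inserts; the extract/transform/load separation is what carries over.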

Posted 1 week ago

Apply

1.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Design and develop highly available, reliable, secure and fault-tolerant systems. Able to take responsibility for multiple services maintained by the team. Own and improve product reliability by collaborating with Tech stakeholders. Contribute to engineering efforts from planning to execution and delivery to solve complex engineering problems. Manage individual priorities, deadlines, and deliverables. Provide Level 2 technical support to address incident escalations. Requirements: Proven ability to design, develop and manage microservices at scale. Proven ability to handle incident escalations of complex technical issues. Willingness to cross team/role boundaries and work with other teams/other roles. Willingness to learn and experiment with new languages and technologies. Experience: 1+ year of overall work experience. Previous experience working in Enterprise SaaS is a plus. Qualifications: Minimum qualification: Bachelor's degree in Engineering. Technical: Understanding of Computer Science fundamentals, using them to effectively design and develop microservices at scale. Hands-on experience developing microservices in Python / Golang is preferable. Hands-on experience working with RDBMS like PostgreSQL. Experience working with gRPC is a plus. Experience working with container platforms like Docker, Kubernetes is a plus
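The fault-tolerance requirement above usually translates into concrete patterns such as retrying calls to an unreliable dependency with exponential backoff. A minimal generic sketch in Python; the `flaky_call` dependency is hypothetical, standing in for any downstream service:

```python
import time

def retry(fn, attempts=3, base_delay=0.01, sleep=time.sleep):
    """Call fn, retrying on exception with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # budget exhausted: surface the failure
            sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, 0.04s, ...

# Hypothetical flaky dependency: fails twice, then succeeds.
calls = {"n": 0}
def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("upstream unavailable")
    return "ok"

result = retry(flaky_call, sleep=lambda _: None)  # no real sleeping in the demo
```

Production versions typically add jitter, cap the total delay, and retry only on errors known to be transient, but the shape is the same.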

Posted 1 week ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Gurugram

Work from Office

Prior experience in application development, with at least 3 years of professional focus on Microsoft Power Apps. Proficiency in Power Apps development, including Canvas Apps, Model-Driven Apps, and Power Automate. Ability to create custom components in Power Apps. Ability to create business and IT processes with out-of-the-box and custom connectors with Microsoft Automate/Flow. Ability to create and connect child flows. Ability to leverage the use of Microsoft Power Automate or Azure Logic Apps. A clear understanding of Power Platform functions and limitations. Collaborate with stakeholders to gather requirements and provide technical solutions. Stay up to date with the latest Power Apps and Power Automate features and trends. Strong problem-solving and analytical abilities. Provide technical insights and guidance to development teams throughout the software development lifecycle. Knowledge of best practices for app performance optimization and security within the Power Apps platform. Exceptional verbal and written communication. Technical Skills and Competence: Power Apps, Power Flow/Automate, Power BI, SharePoint, Dataverse, REST API, Azure DevOps. Qualifications Experience: 2-4 years of professional experience in Power Apps and Power Flow/Automate development. Background in integrating Power Apps with other Microsoft services (e.g., SharePoint, Dynamics 365, Dataverse). Experience with Azure services and cloud-based solutions. Knowledge of user experience (UX) design principles to enhance application usability. Proven expertise in RDBMS databases with hands-on query optimization experience. Should have basic knowledge of .Net, .Net Core, SQL Server. Education: Bachelor's or master's degree in Computer Science, Information Technology, or a related field. Relevant certifications (such as Microsoft Certified: Power Apps Developer Associate or Power Platform Developer Associate).
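Integrating with Dataverse (listed above alongside REST API skills) typically happens over its OData-style Web API. A hedged sketch of assembling such a request URL in Python; the environment host and table name are hypothetical, and the real set of query options is documented in the Dataverse Web API reference:

```python
from urllib.parse import urlencode

def dataverse_query_url(host, table, select=None, filter_=None, top=None):
    """Build an OData-style query URL for a Dataverse table (illustrative only)."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # column projection
    if filter_:
        params["$filter"] = filter_            # OData filter expression
    if top:
        params["$top"] = str(top)              # row limit
    query = ("?" + urlencode(params)) if params else ""
    return f"https://{host}/api/data/v9.2/{table}{query}"

url = dataverse_query_url(
    "contoso.crm.dynamics.com",  # hypothetical environment host
    "accounts",
    select=["name", "revenue"],
    top=5,
)
```

A real integration would also attach an OAuth bearer token and the standard OData headers; this sketch only shows how the query string composes.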

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Our Data Engineers play a crucial role in designing and operationalizing transformational enterprise data solutions on Cloud Platforms, integrating Azure services, Snowflake technology, and other third-party data technologies. Cloud Data Engineers will work closely with a multidisciplinary agile team to build high-quality data pipelines that drive analytic solutions. These solutions will generate insights from our connected data, enabling Kimberly-Clark to advance its data-driven decision-making capabilities. The ideal candidate will have a deep understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows. They should be skilled in creating data products that support analytic solutions and possess proficiency in working with APIs and understanding data structures to serve them. Experience in using ADF (Azure Data Factory) for orchestrating and automating data movement and transformation is required. Additionally, expertise in data visualization tools, specifically PowerBI, is required. The candidate should have strong problem-solving skills, be able to work as part of a technical, cross-functional analytics team, and be an agile learner with a passion for solving complex data problems and delivering insights. If you are an agile learner, possess strong problem-solving skills, can work as part of a technical, cross-functional analytics team, and want to solve complex data problems while delivering insights that help enable our analytics strategy, we would like to hear from you. This role is perfect for a developer passionate about leveraging cutting-edge technologies to create impactful digital products that connect with and serve our clients effectively. Kimberly-Clark has an amazing opportunity to continue leading the market, and DTS is poised to deliver compelling and robust digital capabilities, products, and solutions to support it. This role will have substantial influence in this endeavor.
If you are excited to make a difference applying cutting-edge technologies to solve real business challenges and add value to a global, market-leading organization, please come join us! Scope/Categories: Role will report to the Data Analytics Engineer Manager and Product Owner. Key Responsibilities: Design and operationalize enterprise data solutions on Cloud Platforms: Develop and implement scalable and secure data solutions on cloud platforms, ensuring they meet enterprise standards and requirements. This includes designing data architecture, selecting appropriate cloud services, and optimizing performance for data processing and storage. Integrate Azure services, Snowflake technology, and other third-party data technologies: Seamlessly integrate various data technologies, including Azure services, Snowflake, and other third-party tools, to create a cohesive data ecosystem. This involves configuring data connectors, ensuring data flow consistency, and managing dependencies between different systems. Build and maintain high-quality data pipelines for analytic solutions: Develop robust data pipelines that automate the extraction, transformation, and loading (ETL) of data from various sources into a centralized data warehouse or lake. Ensure these pipelines are efficient, reliable, and capable of handling large volumes of data. Collaborate with a multidisciplinary agile team to generate insights from connected data: Work closely with data scientists, analysts, and other team members in an agile environment to translate business requirements into technical solutions. Participate in sprint planning, stand-ups, and retrospectives to ensure timely delivery of data products. Manage and create data inventories for analytics and APIs to be consumed: Develop and maintain comprehensive data inventories that catalog available data assets and their metadata.
Ensure these inventories are accessible and usable by various stakeholders, including through APIs that facilitate data consumption. Design data integrations with internal and external products: Architect and implement data integration solutions that enable seamless data exchange between internal systems and external partners or products. This includes ensuring data integrity, security, and compliance with relevant standards. Build data visualizations to support analytic insights: Create intuitive and insightful data visualizations using tools like PowerBI, incorporating semantic layers to provide a unified view of data and help stakeholders understand complex data sets and derive actionable insights. Required Skills and Experience: Proficiency with Snowflake Ecosystem: Demonstrated ability to use Snowflake for data warehousing, including data ingestion, transformation, and querying. Proficiency in using Snowflake's features for scalable data processing, including the use of Snowpipe for continuous data ingestion and Snowflake's SQL capabilities for data transformation. Ability to optimize Snowflake performance through clustering, partitioning, and other best practices. Azure Data Factory (ADF): Experience in using ADF for orchestrating and automating data movement and transformation within the Azure ecosystem. Proficiency in programming languages such as SQL, NoSQL, Python, Java, R, and Scala: Strong coding skills in multiple programming languages used for data manipulation, analysis, and pipeline development. Experience with ETL (extract, transform, and load) systems and API integrations: Expertise in building and maintaining ETL processes to consolidate data from various sources into centralized repositories, and integrating APIs for seamless data exchange.
Understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows: You should have a comprehensive knowledge of designing and implementing data systems that support various analytic and operational use cases, including data storage, processing, and retrieval. Basic understanding of machine learning concepts to support data scientists on the team: Familiarity with key machine learning principles and techniques to better collaborate with data scientists and support their analytical models. Strong problem-solving skills and ability to work as part of a technical, cross-functional analytics team: Excellent analytical and troubleshooting abilities, with the capability to collaborate effectively with team members from various technical and business domains. Skilled in creating data products that support analytic solutions: Proficiency in developing data products that enable stakeholders to derive meaningful insights and make data-driven decisions. This involves creating datasets, data models, and data services tailored to specific business needs. Experience in working with APIs and understanding data structures to serve them: Experience in designing, developing, and consuming APIs for data access and integration. This includes understanding various data structures and formats used in API communication. Knowledge of managing sensitive data, ensuring data privacy and security: Expertise in handling sensitive data with strict adherence to data privacy regulations and security best practices to protect against unauthorized access and breaches. Agile learner with a passion for solving complex data problems and delivering insights: A proactive and continuous learner with enthusiasm for addressing challenging data issues and providing valuable insights through innovative solutions.
Experience with CPG Companies and POS Data: Experience in analyzing and interpreting POS data to provide actionable insights for CPG companies, enhancing their understanding of consumer behavior and optimizing sales strategies. Knowledge and Experience: Bachelor's degree in management information systems/technology, Computer Science, Engineering, or related discipline. MBA or equivalent is preferred. 7+ years of experience in designing large-scale data solutions, performing design assessments, crafting design options and analysis, finalizing preferred solution choice working with IT and Business stakeholders. 5+ years of experience tailoring, configuring, and crafting solutions within the Snowflake environment, including a profound grasp of Snowflake's data warehousing capabilities, data architecture, SQL optimization for Snowflake, and leveraging Snowflake's unique features such as Snowpipe, Streams, and Tasks for real-time data processing and analytics. A strong foundation in data migration strategies, performance tuning, and securing data within the Snowflake ecosystem is essential. 3+ years demonstrated expertise in architecting solutions within the Snowflake ecosystem, adhering to best practices in data architecture and design patterns. 7+ years of data engineering or design experience, designing, developing, and deploying scalable enterprise data analytics solutions from source system through ingestion and reporting. Expertise in data modeling principles/methods including Conceptual, Logical, and Physical Data Models for data warehouses, data lakes and/or database management systems. 5+ years of hands-on experience designing, building, and operationalizing data solutions and applications using cloud data and analytics services in combination with 3rd parties. 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
7+ years of experience with database development and scripting. Professional Skills: Strong communication and interpersonal skills. Strong analytical and problem-solving skills and passion for product development. Strong understanding of Agile methodologies and open to working in agile environments with multiple stakeholders. Professional attitude and service orientation; team player. Ability to translate business needs into potential analytics solutions. Strong work ethic, ability to work at an abstract level and gain consensus. Ability to build a sense of trust and rapport to create a comfortable and effective workplace. Self-starter who can see the big picture, prioritize work to make the largest impact on the business and customers' vision and requirements. Fluency in English.
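The Snowflake Streams and Tasks features this listing calls out exist to process only changed data. The underlying idea can be illustrated with a plain-Python high-watermark sketch; this is purely an illustration, and the `event_id` field and in-memory sink are hypothetical stand-ins for the warehouse's native change tracking.

```python
def incremental_load(source_rows, state, sink):
    """Load only rows newer than the last seen watermark (event_id)."""
    watermark = state.get("watermark", 0)
    new_rows = [r for r in source_rows if r["event_id"] > watermark]
    sink.extend(new_rows)
    if new_rows:
        # Advance the watermark so already-loaded rows are skipped next run.
        state["watermark"] = max(r["event_id"] for r in new_rows)
    return len(new_rows)

state, sink = {}, []
batch1 = [{"event_id": 1, "v": "a"}, {"event_id": 2, "v": "b"}]
n1 = incremental_load(batch1, state, sink)      # first run: everything is new
batch2 = batch1 + [{"event_id": 3, "v": "c"}]   # source re-sends old rows
n2 = incremental_load(batch2, state, sink)      # only the genuinely new row loads
```

A Snowflake Stream plays the role of `state` here, tracking change offsets on a table so a scheduled Task can consume each change exactly once.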

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

New Payment Flows (NPF) division's charter is to capture new sources of money movement through card and non-card flows, including Visa Business Solutions, Government Solutions and Visa Direct, which presents an enormous growth opportunity. Our team brings payment solutions and associated services to clients around the globe. Our global clients and partners deploy our solutions to serve the needs of Small Businesses, Middle Market Clients, Large Corporate Clients, Multinationals and Governments. The Visa Business Solutions (VBS) and Visa Government Solutions (VGS) team is a world-class technology organization experiencing tremendous, double-digit growth as we expand products into new payment flows and continue to grow our core card solutions. This is an incredibly exciting team to join as we expand globally. Responsibilities: Design and develop critical systems with high availability and high performance. Design and develop scalable backend applications using Java, Spring Boot, and other relevant technologies. Develop front-end applications using frameworks such as Angular, React, or Vue.js. Design, code and integrate n-tier applications with different application components. Collaborate with cross-functional teams to define, design, and ship new features. Responsible for design and development of different products utilizing Visa's vast data. Demonstrate interest in innovation and ideation, and contribute to development of new products by building quick POCs and converting ideas into real products. Experience with Docker, Kubernetes, OpenShift, IaaS hybrid cloud deployment, and infrastructure/production support and ITIL process. Experience in Kubernetes Administration and OpenShift Infrastructure and Architecture. Collaborate with business and technology stakeholders to deliver high quality products and services that meet business requirements and exceed expectations while applying the latest available tools and technology. Have a passion for delivering zero-defect code.
Improve developer efficiency by utilizing Continuous Integration/Development tools, test automation frameworks and other related items. Effectively communicate status, issues, and risks in a precise and timely manner. Conduct code reviews and ensure adherence to best practices. Work in Agile/Scrum teams and follow the guidelines. Create documentation and procedures for installation and maintenance. Collaborate with global and virtual teams on software development. Identify opportunities for future enhancements and refinements to products, standards, best practices and development methodologies. Provide coaching and mentoring to junior team members. Stay updated with emerging technologies and industry trends. This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager. Basic Qualifications: 5+ years of relevant work experience with a Bachelor's Degree, or at least 2 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD), or 0 years of work experience with a PhD, or 8+ years of relevant work experience. Preferred Qualifications: 7.5+ years of relevant experience in software design, architecture, and development lifecycle. 5+ years of relevant work experience with a Bachelor's Degree, or at least 2 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD), or 0 years of work experience with a PhD, or 8+ years of relevant work experience.
Proven experience in full-stack software development for large-scale, mission-critical applications Strong expertise in n-tier web application development using Java/J2EE or equivalent frameworks Backend Development Skills Extensive experience with Spring or Spring Boot, Spring MVC, JPA, and Hibernate frameworks Proven expertise with RDBMS systems including SQL Server, Oracle, or DB2 Proficiency in Web Services/API development using SOAP or REST, JSON, and XML Experience with DevOps tools including GIT/Stash, Maven, and Jenkins Frontend Development Skills Strong experience in UI/Web development using Angular/React, JavaScript, jQuery, and HTML/CSS Ability to create responsive, user-friendly interfaces that meet business requirements Additional Technical Skills Experience collaborating with security teams to ensure compliance with security standards Knowledge of big data technologies (Spark, Scala, HDFS, Hive) is a plus Leadership & Professional Skills Experience leading a development module/team or mentoring junior developers is a plus Excellent problem-solving skills with strong attention to detail Demonstrated ability to deliver zero-defect code that meets or exceeds defect SLAs Strong sense of accountability for quality and timeliness of deliverables Ability to manage multiple projects and adjust priorities based on changing requirements Experience working in Agile/Scrum environments and following established processes Excellent presentation, collaboration, and communication skills

Posted 1 week ago

Apply

12.0 - 15.0 years

50 - 55 Lacs

Chennai

Work from Office

Lead platform-focused scrum teams. Design and develop solutions to complex use-cases and issues. Serve as a technical expert on the design and architecture. Develop talent and grow your team into a high-performing, healthy development organization. You will leverage your leadership to support your team in design, implementation and release of the new experiences you create. Serve as an escalation point for internal customer support and product teams. As an engineering manager you will be expected to create a healthy and inclusive team culture consistent with Providence Values by modeling, caring and coaching. Lead strategic planning to achieve business goals by identifying and prioritizing development initiatives that align with Freshworks' goals. Develop successful long-term code/service architectures that scale. Define, measure and monitor the engineering solutions, and continuously innovate to provide a seamless and modern experience. Set up solid engineering processes and drive engineering excellence, i.e., code line management, code review standards, continuous integration/continuous deployment workflows and tools. Review and drive engineering process improvements; make decisions about engineering tools and processes. Be a technical leader, making decisions without higher-level validation and supporting the team. Qualifications: Experience in leading technical teams in software engineering with a platform mindset. Strong experience in leading engineering teams shipping and deploying product to customers. Passionate about the architectural design of large-scale, highly reliable, and highly available systems, Design Patterns & SOLID principles, especially around Microservices/API integrations and integrations with 3rd-party systems. Expertise in one or more programming languages like Java, and in RDBMS (like MySQL/Oracle). Hands-on experience with API Gateway and Kafka. Cloud/SaaS experience, with infra knowledge of popular internet-serving applications in AWS (preferred) or other cloud and SaaS providers.
Excellent problem-solving skills with a solid understanding of data structures and algorithms. Excellent interpersonal skills and the ability to work with teams working across various remote locations.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies