
1198 Normalization Jobs - Page 15

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

7.5 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Machine Learning
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead research and development efforts in AI/ML technologies.
- Implement and optimize machine learning models.
- Conduct data analysis and interpretation for business insights.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Machine Learning.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Machine Learning.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.
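The data-munging skills this listing calls out (cleaning, transformation, normalization) map to a standard preprocessing routine. Below is a minimal illustrative sketch in Python using pandas and scikit-learn; the column names and values are invented for demonstration only.

```python
# Minimal sketch of the data-cleaning and normalization steps named above.
# Column names ("age", "income") and values are hypothetical.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler, StandardScaler

df = pd.DataFrame({
    "age": [25, 32, None, 51, 44],
    "income": [30_000, 58_000, 42_000, None, 95_000],
})

# Cleaning: fill missing values with each column's median
df = df.fillna(df.median(numeric_only=True))

# Min-max normalization: rescale each feature to [0, 1]
minmax = MinMaxScaler().fit_transform(df)

# Z-score standardization: zero mean, unit variance per feature
zscore = StandardScaler().fit_transform(df)

print(minmax.round(2))
print(zscore.round(2))
```

Which rescaling to use depends on the downstream model; distance-based methods typically want one of the two applied consistently across train and inference data.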

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Machine Learning
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. You will work on cloud or on-prem application pipelines of production-ready quality, incorporating deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop applications and systems using AI tools and Cloud AI services.
- Apply GenAI models as part of the solution.
- Implement deep learning and neural networks.
- Create chatbots and work on image processing.
- Collaborate with the team to provide innovative solutions.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Machine Learning.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Machine Learning.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Large Language Models
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop applications and systems using AI tools and Cloud AI services.
- Implement deep learning and neural networks in solutions.
- Create chatbots and work on image processing tasks.
- Collaborate with team members to provide innovative solutions.
- Stay updated on the latest AI/ML trends and technologies.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
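Since this listing pairs Large Language Models with the page's normalization theme, one concrete intersection is L2-normalizing embedding vectors so that cosine similarity reduces to a dot product. Here is a small NumPy sketch; the vectors are toy values, not real model output.

```python
# Sketch: L2-normalize embedding vectors so cosine similarity is a dot product.
# The 4-dimensional "embeddings" are made up for illustration.
import numpy as np

embeddings = np.array([
    [0.2, 0.9, 0.4, 0.1],   # e.g., a document vector
    [0.1, 0.8, 0.5, 0.0],   # a similar document
    [0.9, 0.1, 0.0, 0.7],   # a dissimilar one
])

# Divide each row by its Euclidean norm (L2 normalization)
norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
unit = embeddings / norms

# After normalization, the dot product of two rows equals their cosine similarity
similarity = unit @ unit.T
print(similarity.round(3))
```

This is why many vector stores index unit-length embeddings: dot-product search then ranks identically to cosine search.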

Posted 3 weeks ago

Apply

7.5 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Salesforce Einstein AI
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of AI/ML models.
- Conduct research on emerging AI technologies.
- Optimize AI algorithms for performance and scalability.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Salesforce Einstein AI.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Salesforce Einstein AI.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
The Specialist, Development – Database will be responsible for the design and development of complex database-related software, tools, and techniques for the bank. This position demands excellent communication, problem-solving skills, and the ability to interact positively with stakeholders.

Responsibilities

Strategy
- Develop and implement software development strategies aligned with the organization's goals and objectives.
- Drive innovation and continuous improvement in software development practices.
- Ensure the adoption of best practices and emerging technologies in the banking domain.

Business
- Collaborate with business stakeholders to understand their requirements and translate them into technical solutions.
- Ensure that software solutions meet business needs and deliver value to the organization.
- Support business growth by developing scalable and robust software applications.

Processes
- Oversee the entire software development lifecycle, from requirement gathering to deployment and maintenance.
- Ensure adherence to Agile and Tribe model processes, including sprint planning, daily stand-ups, and retrospectives.
- Maintain clear and comprehensive documentation throughout the development process.

Risk Management
- Lead, mentor, and develop a team of software developers, fostering a culture of collaboration and continuous learning.

Governance
- Identify and mitigate risks associated with software development projects.
- Ensure compliance with industry standards and regulatory requirements.
- Implement robust testing and quality assurance processes to deliver error-free software.

Regulatory & Business Conduct
- Display exemplary conduct and live by the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles.
- Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters.

Key Stakeholders
Our stakeholders include Product Owners (PO), the Ops team, Business Partners, the Engineering Lead (EL), Sub Domain Tech Leads (SDTL), Chapter Leads (CL), ICS, Production Support Teams, Process & Audit Teams, the Integration Team, and surrounding interfacing systems, such as upstream and downstream teams.

Qualification
- Qualification: Bachelor's or Master's degree
- Experience: 8 to 12 years
- Skill: Software Development Life Cycle (SDLC)
- Skills Preference: Relevant skills certifications

Role-Specific Technical Competencies
- SQL proficiency, database design, data modelling, normalization, performance tuning, data security, backup and recovery, and transaction management
- PL/SQL, Oracle RDBMS
- Shell scripting, data manipulation, and automation knowledge
- ETL

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us.

Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
- Flexible working options based around home and office locations, with flexible working patterns.
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders, and a range of self-help toolkits.
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning.
- Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential.
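The role-specific competencies above (database design, normalization, transaction management) revolve around keeping each fact in exactly one table. Here is a self-contained Python/sqlite3 sketch of a normalized two-table design; the table and column names and data are invented for illustration.

```python
# Sketch of relational normalization: factor repeated customer data out of
# an "orders" table into its own relation, joined by a foreign key.
# Schema and data are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized design: customer attributes live in one place, so a
# customer's city is stored once instead of once per order.
cur.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
INSERT INTO customers VALUES (1, 'Asha', 'Chennai'), (2, 'Ravi', 'Pune');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 120.5), (12, 2, 99.9);
""")

# Reassemble the denormalized view with a join when needed
for row in cur.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```

The payoff is update integrity: changing a customer's city touches one row, which is the practical content of third normal form.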

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

On-site

Description

About Norstella
At Norstella, our mission is simple: to help our clients bring life-saving therapies to market quicker—and help patients in need. Founded in 2022, but with history going back to 1939, Norstella unites best-in-class brands to help clients navigate the complexities at each step of the drug development life cycle—and get the right treatments to the right patients at the right time.

Each organization (Citeline, Evaluate, MMIT, Panalgo, The Dedham Group) delivers must-have answers for critical strategic and commercial decision-making. Together, via our market-leading brands, we help our clients:
- Citeline – accelerate the drug development cycle
- Evaluate – bring the right drugs to market
- MMIT – identify barriers to patient access
- Panalgo – turn data into insight faster
- The Dedham Group – think strategically for specialty therapeutics

By combining the efforts of each organization under Norstella, we can offer an even wider breadth of expertise, cutting-edge data solutions, and expert advisory services alongside advanced technologies such as real-world data, machine learning, and predictive analytics. As one of the largest global pharma intelligence solution providers, Norstella has a footprint across the globe, with teams of experts delivering world-class solutions in the USA, UK, the Netherlands, Japan, China, and India.

Job Description
We are seeking a Data Architect - RWD and Analytics Products to lead the design and implementation of scalable real-world data (RWD) solutions architecture. This role sits within the Product team but maintains strong collaboration with Engineering to ensure technical feasibility and execution. The ideal candidate has expertise in healthcare data, claims, EHR, lab, and other types of RWD, and is skilled in translating business needs into scalable, high-impact data products. This role will be instrumental in shaping data-driven products, optimizing data architectures, and ensuring the integration of real-world data assets into enterprise solutions that support life sciences, healthcare, and payer analytics.

Responsibilities
- Define and drive the requirements for RWD data products.
- Collaborate with leadership, product managers, customers, and data scientists to identify high-value use cases.
- Translate business and regulatory requirements into scalable and performant data models and solutions.
- Develop architectures to support payer claims, labs, and EHR-sourced insight generation and analytics.
- Partner with healthcare providers, payers, and life sciences companies to enhance data interoperability.
- Work closely with Engineering to design and implement responsive analytics layers and data architectures.
- Provide technical guidance on ETL pipelines, data normalization, and integration with third-party RWD sources.
- Architect solutions to aggregate, standardize, and analyze EHR and molecular data, ensuring compliance with healthcare regulations (HIPAA, GDPR).
- Define best practices for claims data ingestion, quality control, and data transformations.
- Develop frameworks for processing structured and unstructured EHR data, leveraging NLP and data harmonization techniques.
- Ensure compliance with HIPAA, GDPR, and regulatory frameworks for healthcare data products.
- Define and implement data governance strategies to maintain high data integrity and lineage tracking.

Requirements
- Deep understanding of payer data, the claims lifecycle, EHR, labs, and real-world data applications.
- Ability to translate business needs into technical solutions and drive execution.
- Strong understanding of the data product lifecycle and product management principles.
- Experience working with cross-functional teams, including Product, Engineering, Clinical, Business, and Customer Success.
- Excellent communication skills to engage with both technical and non-technical stakeholders.
- Expertise in RWD and payer data structures (claims, EMR/EHR, registry data, prescription data, etc.).
- Proficiency in SQL and NoSQL databases (PostgreSQL, Snowflake, MongoDB, etc.).
- Strong knowledge of ETL processes and data pipeline orchestration.
- Experience with big data processing (Spark, Databricks, Hadoop).
- Understanding of payer and provider data models used in healthcare analytics.
- Strong presentation and documentation skills to articulate solutions effectively.
- Experience working with payer organizations, PBMs, life sciences, and health plans.
- Experience with OMOP, FHIR, HL7, and other healthcare data standards.
- Knowledge of data governance, metadata management, and lineage tracking tools.
- Experience in pharmaceutical RWE studies and market access analytics.
- Familiarity with BI tools (Tableau, Power BI, Looker).
- Understanding of data mesh and federated data architectures.

Benefits
- Health Insurance
- Provident Fund
- Life Insurance
- Reimbursement of Certification Expenses
- Gratuity
- 24x7 Health Desk

Our guiding principles for success at Norstella:

01: Bold, Passionate, Mission-First
We have a lofty mission to Smooth Access to Life Saving Therapies, and we will get there by being bold and passionate about the mission and our clients. Our clients and the mission in what we are trying to accomplish must be at the forefront of our minds in everything we do.

02: Integrity, Truth, Reality
We make promises that we can keep, and goals that push us to new heights. Our integrity offers us the opportunity to learn and improve by being honest about what works and what doesn't. By being true to the data and producing realistic metrics, we are able to create plans and resources to achieve our goals.

03: Kindness, Empathy, Grace
We will empathize with everyone's situation, provide positive and constructive feedback with kindness, and accept opportunities for improvement with grace and gratitude. We use this principle across the organization to collaborate and build lines of open communication.

04: Resilience, Mettle, Perseverance
We will persevere, even in difficult and challenging situations. Our ability to recover from missteps and failures in a positive way will help us to be successful in our mission.

05: Humility, Gratitude, Learning
We will be true learners by showing humility and gratitude in our work. We recognize that the smartest person in the room is the one who is always listening, learning, and willing to shift their thinking.

Norstella is an equal opportunities employer and does not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, color, nationality, ethnic or national origin, religion or belief, disability, or age. Our ethos is to respect and value people's differences, to help everyone achieve more at work as well as in their personal lives so that they feel proud of the part they play in our success. We believe that all decisions about people at work should be based on the individual's abilities, skills, performance, and behavior and our business requirements. Norstella operates a zero-tolerance policy toward any form of discrimination, abuse, or harassment.

Sometimes the best opportunities are hidden by self-doubt. We disqualify ourselves before we have the opportunity to be considered. Regardless of where you came from, how you identify, or the path that led you here, you are welcome. If you read this job description and feel passion and excitement, we're just as excited about you.
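Several responsibilities in this listing center on ETL pipelines and data normalization across heterogeneous RWD sources. The toy pandas sketch below harmonizes two differently-shaped "claims" feeds into one schema; every field name, code, and value is invented and not tied to any real data standard.

```python
# Toy ETL sketch: harmonize two differently-shaped claims feeds into one
# normalized schema. All field names, codes, and values are invented.
import pandas as pd

feed_a = pd.DataFrame({
    "MemberID": ["A1", "A2"],
    "Gender":   ["M", "F"],
    "PaidAmt":  ["120.50", "89.00"],   # amounts arrive as strings in feed A
})
feed_b = pd.DataFrame({
    "member":   ["B9"],
    "sex":      ["female"],
    "paid_usd": [42.0],
})

def normalize_a(df: pd.DataFrame) -> pd.DataFrame:
    # Rename fields, recode categorical values, and coerce types
    return pd.DataFrame({
        "member_id": df["MemberID"],
        "sex": df["Gender"].map({"M": "male", "F": "female"}),
        "paid_usd": df["PaidAmt"].astype(float),
    })

def normalize_b(df: pd.DataFrame) -> pd.DataFrame:
    # Feed B already matches the target types; only one column needs renaming
    return df.rename(columns={"member": "member_id"})

claims = pd.concat([normalize_a(feed_a), normalize_b(feed_b)], ignore_index=True)
print(claims)
```

In production this mapping layer is where standards like OMOP or FHIR (named in the requirements) would define the target schema instead of an ad-hoc one.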

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Hello Connections,

Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. It is headquartered in Bengaluru, with gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line, and has major delivery centers in India, including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: .NET Full Stack Developer
Location: Chennai, Pune, Hyderabad, Coimbatore, Bangalore, Noida, Kolkata
Experience: 7 to 9 years (at least 6 years relevant as a .NET Full Stack Developer)
Job Type: Contract to hire
Work Mode: Work from office (5 days)
Shift Timing: 3:30 PM to 12:30 AM and 6:30 PM to 2:30 AM
Notice Period: Immediate joiners (able to join in the third week of July)

Mandatory Skills: .NET Full Stack Developer (at least 6 years relevant)
- 6+ years of hands-on coding experience in .NET full stack as a developer
- 5+ years of hands-on experience in ASP.NET, MVC, Web API, LINQ, and Entity Framework as a developer
- Hands-on experience in SQL queries, stored procedure writing, and table design
- Working experience with Azure DevOps and Azure PaaS
- Ready to work 2 days night shift and 3 days evening shift per US timings (Shift 1: 3:30 PM to 12:30 AM; Shift 2: 6:30 PM to 2:30 AM)
- Excellent communication

Roles and Responsibilities:

1. Backend Development (ASP.NET Core / ASP.NET MVC / Web API)
- Design and develop scalable, maintainable, and secure RESTful APIs using ASP.NET Core.
- Implement business logic and middleware using C#.
- Build and consume Web APIs for client-server communication.
- Use dependency injection, middleware pipelines, and best architectural practices (such as layered or clean architecture).

2. Frontend Development (Angular / React / Blazor / Razor Pages)
- Build interactive UIs using modern frontend frameworks.
- Integrate REST APIs with frontend components.
- Implement responsive design with Bootstrap, CSS, and HTML5.
- Handle state management, routing, and validation on the frontend.

3. Database Development (SQL Server)
- Design relational databases and write complex queries, stored procedures, and triggers.
- Perform database tuning, indexing, normalization, and optimization.
- Integrate Entity Framework or Dapper for ORM-based data access.

4. Azure DevOps (CI/CD, Pipelines, Repos, Boards)
- Set up and maintain CI/CD pipelines for automated build, test, and deployment.
- Use Azure Repos for version control (Git).
- Create and manage Azure DevOps Boards for Agile/Scrum project tracking.
- Use release pipelines to deploy to Azure App Services or other environments.

5. Azure PaaS (Platform as a Service)
- Deploy applications to Azure App Services, Function Apps, and Azure SQL.
- Manage and configure App Service Plans, custom domains, SSL, and scaling rules.
- Use Azure Key Vault for secrets management.
- Implement Application Insights and Log Analytics for monitoring and diagnostics.
- Integrate with Azure Blob Storage, Service Bus, Azure Active Directory, etc.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

Remote

DHIRA Company Overview
DHIRA is a leading company specializing in intelligent transformation, where we leverage advanced AI/ML and data-driven solutions to revolutionize business operations. Unlike traditional digital transformation, which focuses on transaction automation, our intelligent transformation encompasses both transactional automation and deep analytics for comprehensive insights. Our expertise in data engineering, data quality, and master data management ensures robust and scalable AI/ML applications. Utilizing cutting-edge technologies across AWS, Azure, GCP, and on-premises Hadoop systems, we deliver efficient and innovative data solutions. Our vision is embodied in the Akashic platform, designed to provide seamless, end-to-end analytics. At DHIRA, we are committed to excellence, driving impactful contributions to the industry. Join us to be part of a dynamic team at the forefront of intelligent transformation.

Role: Data Architect – Evolution of Databases, Data Modeling, and Modern Data Practices
Location: Bangalore, Remote

Position Overview:
We are seeking a Principal Data Architect with 5+ years of experience who has a comprehensive understanding of the evolution of databases, from OLTP to OLAP, and relational systems to NoSQL, Graph, and emerging Vector Databases. This role requires deep expertise in data modeling, from traditional ER modeling to advanced dimensional, graph, and vector schemas, along with a strong grasp of the history, best practices, and future trends in data management. The ideal candidate will bring both historical context and cutting-edge expertise to architect scalable, high-performance data solutions, driving innovation while maintaining strong governance and best practices. This is a leadership role that demands a balance of technical excellence, strategic vision, and team mentorship.

Key Responsibilities:

1. Data Modeling Expertise:
- Design and implement Entity-Relationship Models (ER models) for OLTP systems, ensuring normalization and consistency.
- Transition ER models into OLAP environments with robust dimensional modeling, including star and snowflake schemas.
- Develop hybrid data models that integrate relational, NoSQL, Graph, and Vector Database schemas.
- Establish standards for schema design across diverse database systems, focusing on scalability and query performance.

2. Database Architecture Evolution:
- Architect solutions across the database spectrum: relational databases (PostgreSQL, Oracle, MySQL); NoSQL databases (MongoDB, Cassandra, DynamoDB); graph databases (Neo4j, Amazon Neptune); vector databases (Pinecone, Weaviate, Milvus).
- Implement hybrid data architectures combining OLTP, OLAP, NoSQL, Graph, and Vector systems for diverse business needs.
- Ensure compatibility and performance optimization across these systems for real-time and batch processing.

3. Data Warehousing and Analytics:
- Lead the development of enterprise-scale data warehouses capable of supporting advanced analytics and business intelligence.
- Design high-performance ETL/ELT pipelines to handle structured and unstructured data with minimal latency.
- Optimize OLAP systems for petabyte-scale data storage and low-latency querying.

4. Emerging Database Technologies:
- Drive adoption of Vector Databases for AI/ML applications, enabling semantic search and embedding-based queries.
- Explore cutting-edge technologies in data lakes, lakehouses, and real-time processing systems.
- Evaluate and integrate modern database paradigms, ensuring scalability for future business requirements.

5. Strategic Leadership:
- Define the organization's data strategy, aligning with long-term goals and emerging trends.
- Collaborate with business and technical stakeholders to design systems that balance transactional and analytical workloads.
- Lead efforts in data governance, ensuring compliance with security and privacy regulations.

6. Mentorship and Innovation:
- Mentor junior architects and engineers, fostering a culture of learning and technical excellence.
- Promote innovation by introducing best practices, emerging tools, and modern methodologies in data architecture.
- Act as a thought leader in database evolution, presenting insights to internal teams and external forums.

Required Skills & Qualifications:

Experience:
- 6+ years of experience in data architecture, with demonstrated expertise across OLTP, OLAP, NoSQL, Graph, and Vector databases.
- Proven experience designing and implementing data models across relational, NoSQL, graph, and vector systems.
- A strong understanding of the evolution of databases and their impact on modern data architectures.

Technical Proficiency:
- Deep expertise in ER modeling, dimensional modeling, and schema design for modern database systems.
- Proficient in SQL and query optimization for relational and analytical databases.
- Hands-on experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.
- Strong knowledge of graph databases (Neo4j, Amazon Neptune) and vector databases (Pinecone, Milvus, or Weaviate).
- Familiarity with modern cloud-based DW platforms (e.g., Snowflake, BigQuery, Redshift) and lakehouse solutions.

Knowledge of Data Practices:
- Historical and practical understanding of data practices, from schema-on-write to schema-on-read approaches.
- Experience in implementing real-time and batch processing systems for diverse workloads.
- Strong grasp of data lifecycle management, governance, and security practices.

Leadership and Communication:
- Ability to lead large-scale data initiatives, balancing technical depth and strategic alignment.
- Excellent communication skills to articulate complex ideas to technical and non-technical audiences.
- Proven ability to mentor and upskill teams, fostering a collaborative environment.

Preferred Skills:
- Experience integrating Vector Databases into existing architectures for AI/ML workloads.
- Knowledge of real-time streaming systems (Kafka, Pulsar) and their integration with modern databases.
- Certifications in data-related technologies (e.g., AWS, GCP, Snowflake, Neo4j).
- Hands-on experience with BI tools (e.g., Tableau, Power BI) and AI/ML platforms.
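Among the modeling skills this role lists, dimensional modeling (star and snowflake schemas) recurs throughout. Below is a compact Python/sqlite3 sketch of a star schema, one fact table joined to two dimensions; the names and data are invented for illustration.

```python
# Sketch of a star schema: a central fact table keyed to two dimension
# tables, followed by a typical roll-up query. Names and data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    revenue    REAL
);
INSERT INTO dim_date VALUES (1, '2025-01'), (2, '2025-02');
INSERT INTO dim_product VALUES (1, 'books'), (2, 'toys');
INSERT INTO fact_sales VALUES (1, 1, 100.0), (1, 2, 40.0), (2, 1, 75.0);
""")

# Roll revenue up by month and category across the star
for row in con.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    USING (date_id)
    JOIN dim_product p USING (product_id)
    GROUP BY d.month, p.category
"""):
    print(row)
```

A snowflake variant would further normalize the dimension tables themselves (e.g., splitting category into its own table), trading join depth for less redundancy.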

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location: Bengaluru, Hyderabad, Pune, Indore
Shift Timing: 2:00 PM - 11:00 PM
Experience: 8-10 years

Job Description:
1. At least 5-8 years of experience in the design, development, implementation, upgrade, customization, and maintenance of Oracle EBS covering the complete system; execute a techno-functional role on Oracle EBS.
2. Contribute to the development and customization of reports, forms, interfaces, concurrent programs, and PL/SQL; experience in developing reports using XML/BI Publisher and Workflow.
3. Develop and customize RICE components such as Reports, Forms, Interfaces, Conversions, and Extensions as per client requirements.
4. Strong programming experience in developing procedures, packages, APIs, functions, triggers, and other database objects using SQL and PL/SQL.
5. Utilize Oracle Apps techno-functional knowledge to address challenges in the Financials and SCM modules.

Required Skills and Qualifications:
1. Proficiency in Oracle (SQL/PL/SQL), Oracle Builder 10g (Reports, Forms), SQL*Loader, XML Publisher, and UTL_FILE utilities, with experience in report development and customization.
2. Strong grasp of Workflow.
3. Extensive experience in developing new reports and customizing existing reports.
4. Thorough understanding of Oracle EBS and the ability to work effectively as part of a collaborative team.
5. Proven track record in handling complex queries, exception handlers, and technical documentation preparation.
6. Effective stakeholder management, covering both internal and external stakeholders for the project.
7. Lead a team to deliver the expected business results.
8. Execute unit testing to ensure the sanity of developments.
9. Realistically estimate requirements from an effort and cost perspective.
10. Realize the development/requirement as per the technical specification provided.
11. Adhere to all coding standards and performance measures, per customer/organization guidelines.
12. Coordinate with the functional team for requirement gathering/understanding and convert functional specifications to technical specifications.
13. Handle internal and external stakeholders of projects.
14. Mentor junior consultants on the team and provide assistance if needed.
15. Participate in testing activities along with functional counterparts and provide quick resolutions.
16. Coordinate with functional/business POCs for requirement analysis and to provide appropriate technical solutions.

Good to Have:
1. Hands-on experience in developing Technical Specification Documents, Gap Analysis, and Application Design (MD, CV) using the AIM methodology.
2. Expertise in performing data export, import, and various operations using TOAD/SQL Developer.
3. Expertise in troubleshooting and query performance tuning using EXPLAIN_PLAN, SQL Trace, the TKPROF utility, and hints provided by Oracle.
4. Good understanding of the Oracle data dictionary and normalization techniques.
5. Excellent team player with good communication and interpersonal skills.
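The tuning utilities named above (EXPLAIN_PLAN, SQL Trace, TKPROF) are Oracle-specific, but the plan-reading habit they support is portable. As a stand-in, the sketch below uses SQLite's EXPLAIN QUERY PLAN to show an index turning a full table scan into an index search; the schema is invented and the Oracle tooling is not reproduced here.

```python
# Portable illustration of plan-based tuning: compare SQLite's query plan
# for the same lookup before and after adding an index. Schema is invented;
# on Oracle the analogous steps would use EXPLAIN PLAN / SQL Trace / TKPROF.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, vendor TEXT, total REAL)")
con.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?)",
    [(i, f"v{i % 50}", i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM invoices WHERE vendor = 'v7'"

# Without an index: the plan reports a full SCAN of invoices
print(con.execute("EXPLAIN QUERY PLAN " + query).fetchall())

con.execute("CREATE INDEX idx_vendor ON invoices(vendor)")

# With the index: the plan reports a SEARCH ... USING INDEX idx_vendor
print(con.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```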

Posted 3 weeks ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Noida, Uttar Pradesh

On-site

Position: Web Developer

We are looking for a highly skilled Web Developer with 1 year of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain high-quality, efficient, and well-documented code
- Troubleshoot and resolve technical issues
- Implement social network integration, payment gateway integration, and Web 2.0 features in web-based projects
- Work with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Develop and maintain database PL/SQL, stored procedures, and triggers

Requirements:
- 1 year of experience in web-based project development using PHP
- Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, osCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana
- Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies
- Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Well-versed with MySQL (can work with other SQL flavors too)

Job Type: Full-time
Pay: ₹15,000.00 - ₹20,000.00 per month
Application Question(s): Are you okay with our budget, i.e., ₹18k-20k per month?

Experience:
- Core PHP: 1 year (Required)
- Laravel: 1 year (Required)
- WordPress: 1 year (Preferred)

Location: Noida, Uttar Pradesh (Required)
Work Location: In person

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Key Responsibilities
- Develop and maintain scalable server-side applications using Node.js, Express.js, and TypeScript
- Design robust and secure RESTful APIs with proper routing, middleware, and error handling
- Build and optimize relational database schemas using PostgreSQL, ensuring performance, normalization, and data integrity
- Integrate and manage ORMs like Prisma or TypeORM for efficient and type-safe database operations
- Implement authentication and authorization using JWT, session-based methods, and OAuth protocols
- Validate request and response data using Zod or Joi to ensure type safety and data integrity
- Handle file uploads and media storage using Multer, and integrate with Cloudinary, AWS S3, or similar services
- Write clean, testable, and modular code following SOLID principles
- Create and maintain API documentation using tools like Postman or Swagger
- Implement security best practices such as input sanitization, rate limiting, secure headers, and CORS configuration
- Perform unit and integration testing using Jest and Supertest
- Collaborate closely with frontend developers to define and deliver seamless API experiences
- Manage deployments using platforms like Vercel, Render, Railway, DigitalOcean, or AWS (EC2/S3)
- Configure CI/CD pipelines using GitHub Actions, PM2, or Docker for automated builds and deployments
- Handle environment configuration securely using .env files and secret managers
- Work with version control (Git) to manage the codebase, branches, and code reviews
- Monitor and debug production issues, ensuring application reliability and performance
- Build real-time features using WebSockets or Socket.IO (optional)

Requirements
- Node.js (event loop, async/await, non-blocking architecture)
- Express.js (middleware, routing, error handling)
- TypeScript (interfaces, generics, type safety)
- PostgreSQL (schema design, joins, indexing, ACID compliance)
- ORMs: Prisma or TypeORM (for PostgreSQL with TypeScript)
- MongoDB (CRUD, aggregation, indexing, geospatial queries) – good to have
- Authentication: JWT, session-based auth, OAuth
- Authorization (role-based access control)
- Data validation (Zod, Joi)
- File uploads and storage (Multer, Cloudinary, AWS S3)
- WebSockets / Socket.IO – good to have
- API documentation (Postman, Swagger)
- Git (branching, commits, PRs)
- Basic testing (Jest, Supertest)
- Security best practices (rate limiting, CORS, sanitization, encryption)
- CI/CD knowledge (GitHub Actions, Docker, PM2)
- Deployment experience: Vercel, Render, Railway, DigitalOcean, or AWS (EC2/S3)
- Environment variable management and secure credential handling
- Cloud platform familiarity (AWS, Vercel, or GCP) – good to have
- Clean code, modular structure, and strong debugging skills

About Company: We are a photography company based in Noida, operating across India and internationally. Our primary services include wedding and pre-wedding shoots, maternity photoshoots, newborn photography, birthday and pre-birthday shoots, as well as corporate and event coverage. To learn more about our work, visit us at www.theimpressio.com and www.theimpressio.in.

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview

Job Title: Senior Engineer – Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description
As a SQL Engineer, you will be responsible for the design, development, and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and bring expertise in data modeling, performance optimization, and working with large-scale relational databases.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your Key Responsibilities
- Design, develop, and optimize complex SQL queries, stored procedures, views, and functions.
- Work with large datasets to perform data extraction, transformation, and loading (ETL).
- Develop and maintain scalable database schemas and models.
- Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns.
- Maintain data security and compliance with data governance policy.

Your Skills and Experience
- 10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL.
- Strong working experience with PL/SQL and T-SQL.
- Strong understanding of data modelling, normalization, and relational DB design.

Desirable Skills That Will Help You Excel
- Ability to write highly performant, heavily resilient queries in Oracle, PostgreSQL, or MSSQL.
- Working knowledge of database modelling techniques like star schema, fact-dimension models, and Data Vault.
- Awareness of database tuning methods such as AWR reports, indexing, partitioning of data sets, and defining tablespace sizes and user roles.
- Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets.
- Good experience in performance tuning, query optimization, and indexing.
- Hands-on experience with object storage and scheduling tools.
- Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms.
- Experience in GCP, cloud database migration experience, and hands-on experience with Postgres.

How We'll Support You
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Consultant - Application Development

Introduction to Role
Are you ready to redefine an industry and change lives? We are seeking a skilled SQL Developer and Support Specialist to design, develop, and maintain efficient databases while providing technical support to ensure the smooth operation of database systems. This is your chance to work inclusively in a diverse team, inspiring change and making a real difference. Collaborate with multi-functional teams to optimize database performance, write complex SQL queries, and solve database-related issues. Join us at a crucial stage of our journey in becoming a digital and data-led enterprise!

Accountabilities

SQL Development
- Design, develop, and optimize SQL queries, stored procedures, functions, and scripts for database management and reporting.
- Create and maintain database schemas, tables, views, indexes, and other database objects.
- Collaborate with software developers to integrate database solutions with applications.
- Develop, implement, and maintain ETL (Extract, Transform, Load) pipelines for data integration.
- Analyze and optimize database performance, including query tuning and indexing strategies.
- Ensure data integrity, consistency, and security in all database systems.

Database Support
- Monitor and maintain database systems to ensure high levels of performance, availability, and security.
- Troubleshoot and resolve database-related issues, including performance bottlenecks and errors.
- Conduct root cause analysis for database-related support incidents and implement preventive measures.
- Provide technical support to end-users and stakeholders, addressing queries and issues related to database functionality and reports.
- Collaborate with IT and infrastructure teams to ensure regular database backups and disaster recovery plans are in place.
- Assist in database migrations, upgrades, and patching activities.

Documentation and Training
- Create and maintain technical documentation, including database architecture, data models, and workflows.
- Train team members and end-users on database standard methodologies and reporting tools.

Essential Skills/Experience
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an SQL Developer or in a similar role.
- Proficiency in SQL programming and database management systems (e.g., SQL Server, Oracle, MySQL, PostgreSQL).
- Solid understanding of database design, normalization, and data modeling concepts.
- Hands-on experience with ETL tools and data integration.
- Familiarity with performance tuning, query optimization, and indexing strategies.
- Experience with database monitoring and troubleshooting tools.
- Knowledge of scripting languages (e.g., Python, PowerShell) is a plus.
- Understanding of data security, backup, and recovery practices.
- Excellent problem-solving skills and attention to detail.
- Good communication and collaboration skills.

Key Traits
- Analytical attitude with a proactive approach to problem-solving.
- Ability to work in a fast-paced, collaborative environment.
- Strong organizational and time-management skills.

Desirable Skills/Experience
- Experience with BI tools such as Power BI, Tableau, or SSRS (SQL Server Reporting Services).
- Familiarity with cloud database solutions (e.g., Azure SQL, Amazon RDS).
- Knowledge of DevOps practices and CI/CD pipelines for database deployments.
- Certification in SQL or database technologies (e.g., Microsoft Certified: Azure Data Engineer, Oracle Database Administrator).

When we put unexpected teams in the same room, we ignite bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and bold world.

At AstraZeneca, our work has a direct impact on patients by redefining our ability to develop life-changing medicines. We empower the business to perform at its peak by combining brand-new science with leading digital technology platforms. With a passion for data analytics, AI, machine learning, and more, we drive cross-company change to redefine the entire industry. Here you can innovate, take ownership, experiment with pioneering technology, and take on challenges that might never have been addressed before. Be part of a team that has the backing to innovate and change lives!

Ready to make an impact? Apply now to join our dynamic team!

Date Posted: 07-Jul-2025
Closing Date: 06-Aug-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Skill required: Insurance Services - Business Intelligence (BI) Reporting Tools
Designation: Measurement & Report Analyst
Qualifications: Bachelor of Information Technology
Years of Experience: 3 to 5 years
Language Ability: English (International) - Advanced

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song — all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com.

What would you do?
The individual in this position will provide reporting and analytical insight to support the Client Services Group by standardizing and automating operational and performance metrics and reporting. We're in search of a seasoned analyst/developer with experience not only in providing reporting but also in telling a story through analytics. They will be responsible for building dashboards and solving analytical problems. This candidate will need to have an expert-level grasp of data and database technologies and be able to develop visualizations that assist business leadership in understanding patterns, dependencies, and interconnections of key metrics. The qualified candidate will be a highly motivated individual who has a passion for developing innovative/creative solutions in a fast-paced environment. This individual must be a self-starter who possesses exceptional technical and critical thinking skills, the ability to work independently, and excellent oral/written communication skills.

What are we looking for?
- Structured Query Language (SQL)
- Tableau
- PHP (programming language)
- Problem-solving skills
- Detail orientation
- Commitment to quality
- Strong analytical skills
- Ability to meet deadlines
- Alteryx
- Python (programming language)

Roles and Responsibilities:
- Develop business intelligence reporting for the Individual Client Solutions production teams and supporting areas, including historical data trend analysis and visualizations that identify process improvement opportunities
- Use a combination of database and web technologies to deliver web-based self-service interfaces, ranging from corporate-level scorecard metrics down to the performance of individual client services representatives
- Partner with business areas to define critical reporting needs, provide guidance regarding the availability of data, and set expectations around delivery of results
- Work with our IT partners in establishing access to data sources, obtaining definitions of data sets, and ensuring the reliability and accuracy of information
- Design and execute data extract/transform/load (ETL) scripts and environment migrations
- Perform ad-hoc analyses that span multiple databases of potentially mixed levels of normalization and "cleanliness"
- Provide support for development and infrastructure issues surrounding the availability of reporting platforms
- Prioritize the development, maintenance, and support of multiple projects simultaneously
- Find creative and simple solutions to complex reporting problems
- Assist with special projects, data maintenance, and other day-to-day tasks
- Analyze data and present results to both technical and non-technical audiences
- Effectively manage workload by prioritizing ad-hoc requests, development efforts, and analysis
- Navigate cross-functional units in order to identify information dependencies

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.

Position Overview:
We seek an experienced Staff Software Engineer to lead the design and development of our data warehouse and analytics platform, in addition to helping raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure. They will actively collaborate with team members and the wider Level AI engineering community to develop highly scalable and performant systems. They will be a technical thought leader who will help drive solving complex problems of today and the future by designing and building simple and elegant technical solutions. They will coach and mentor junior engineers and drive engineering best practices. They will actively collaborate with product managers and other stakeholders both inside and outside the team.

What you'll get to do at Level AI (and more as we grow together):
- Design, develop, and evolve data pipelines that ingest and process high-volume data from multiple external and internal sources
- Build scalable, fault-tolerant architectures for both batch and real-time data workflows using tools like GCP Pub/Sub, Kafka, and Celery
- Define and maintain robust data models with a focus on domain-oriented design, supporting both operational and analytical workloads
- Architect and implement data lake/warehouse solutions using Postgres and Snowflake
- Lead the design and deployment of workflow orchestration using Apache Airflow for end-to-end pipeline automation
- Ensure platform reliability with strong monitoring, alerting, and observability for all data services and pipelines
- Collaborate closely with other internal product and engineering teams to align data platform capabilities with product and business needs
- Own and enforce data quality, schema evolution, data contract practices, and governance standards
- Provide technical leadership, mentor junior engineers, and contribute to cross-functional architectural decisions

We'd love to explore more about you if you have:
- 8+ years of experience building large-scale data systems, preferably in high-ingestion, multi-source environments
- Strong system design, debugging, and performance tuning skills
- Strong programming skills in Python and Java
- Deep understanding of SQL (Postgres, MySQL) and data modeling (star/snowflake schema, normalization/denormalization)
- Hands-on experience with streaming platforms like Kafka and GCP Pub/Sub
- Expertise with Airflow or similar orchestration frameworks
- Solid experience with Snowflake, Postgres, and distributed storage design
- Familiarity with Celery for asynchronous task processing
- Comfort working with ElasticSearch for data indexing and querying
- Exposure to Redash, Metabase, or similar BI/analytics tools
- Proven experience deploying solutions on cloud platforms like GCP or AWS

Preferred Attributes:
- Experience with data governance and lineage tools.
- Demonstrated ability to handle scale, reliability, and incident response in data systems.
- Excellent communication and stakeholder management skills.
- Passion for mentoring and growing engineering talent.

To learn more, visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s

Compensation: We offer market-leading compensation, based on the skills and aptitude of the candidate.
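The orchestration stack this role describes names Apache Airflow for end-to-end pipeline automation. Below is a minimal DAG sketch, assuming Airflow 2.4+ is installed; the task names and functions are hypothetical placeholders, not Level AI's actual pipeline.

```python
# Minimal Airflow 2.4+ DAG sketch for an extract -> normalize -> load flow.
# Task names and function bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from source systems")

def normalize():
    print("standardize schemas, dedupe, enforce data contracts")

def load():
    print("write curated tables to the warehouse")

with DAG(
    dag_id="ingest_and_normalize",
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",   # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="normalize", python_callable=normalize)
    t3 = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency chain: extract runs before normalize before load
    t1 >> t2 >> t3
```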

Posted 3 weeks ago

Apply

10.0 years

15 - 18 Lacs

Pune

On-site

Senior PHP Developer – JD (10+ Years, Multiple Locations, Hybrid)

PHP (Laravel, CodeIgniter): Must have led backend module development using Laravel or CodeIgniter. Should be able to independently build, extend, and maintain enterprise-grade applications.

Modern PHP (PHP 8): Should have used PHP 8 features like union types, named arguments, and attributes in at least one production project. Demonstrated familiarity with PHP 8 syntax and capabilities.

Object-Oriented Programming (OOP): Should have strong command of inheritance, traits, method overriding, and abstract classes. Able to design reusable, modular, and extensible components.

JavaScript (jQuery): Must have used jQuery for DOM manipulation, AJAX handling, form validation, and interactive frontend features within PHP applications.

CSS: Should be able to manage layout styling, apply responsive designs, and collaborate with frontend teams for consistent UI delivery.

MySQL: Should be proficient in writing optimized queries, applying indexes, and resolving performance issues in relational databases. Experience with large data sets is expected.

ORM (Eloquent / Doctrine): Should have used Eloquent or Doctrine for basic CRUD, relationships, and query abstraction. Deep optimization or customization is not mandatory.

RESTful API Development: Should have experience designing and consuming APIs with JSON. Must be able to implement authentication, handle errors, and manage data formatting.

Microservices (API-first): Should understand modular service concepts and how services communicate over REST. Prior experience integrating PHP modules via APIs is required.

SQL Concepts: Must understand the role and differences of primary keys, unique keys, joins, and normalization. Should apply these effectively in schema design and debugging.

Job Types: Full-time, Contractual / Temporary, Freelance
Contract length: 12 months
Pay: ₹130,000.00 - ₹150,000.00 per month
Benefits: Health insurance
Schedule: Monday to Friday
Work Location: In person

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2); a minimal sketch follows this listing
- Creating high-level and detailed designs and maintaining product roadmaps that cover both modernization and cloud adoption
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modeling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (target database Db2)
- Experience in installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes
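To make the migration bullet concrete, below is a minimal, hedged Python sketch of a batched table copy using the standard DB-API pattern. sqlite3 stands in for both endpoints so the sketch runs anywhere; a real Teradata- or Snowflake-to-Db2 migration would swap in the vendor drivers (e.g., ibm_db on the target side) and add type mapping, constraint handling, and fuller validation.

# Hedged sketch of a batched table copy between two databases via the
# Python DB-API pattern. sqlite3 stands in for both ends; names and
# sample data are invented for illustration.
import sqlite3

BATCH_SIZE = 1000

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Stand-in source data.
source.executescript("""
CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
INSERT INTO sales (region, amount) VALUES ('north', 10.0), ('south', 20.0);
""")

# Recreate the schema on the target (in practice, derived from the
# source catalog and mapped to Db2 data types).
target.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

src_cur = source.execute("SELECT id, region, amount FROM sales ORDER BY id")
while True:
    batch = src_cur.fetchmany(BATCH_SIZE)
    if not batch:
        break
    target.executemany("INSERT INTO sales (id, region, amount) VALUES (?, ?, ?)", batch)
target.commit()

# Row-count reconciliation: the cheapest post-migration sanity check.
print(target.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 2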

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detailed designs and maintaining product roadmaps that cover both modernization and cloud adoption
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modeling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (target database Db2)
- Experience in installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Tech Delivery Lead

Role Summary & Role Description
The lead is responsible for providing a strong technical design and implementation platform for the Market Data Hub (MDH) and for end-to-end delivery of the project, ensuring the strategy supports current and future business needs. Lead implementations to provide scalable design, state-of-the-art development and engineering practices, and automation for quality assurance and maintenance in production.
- Work closely with global and regional business and technology stakeholders
- Participate in org-wide initiatives
- Document the detailed data and application architecture for both current and target state
- Understand and implement data privacy requirements
- Understand the application architecture and data flow/transformation
- Capture the logical and physical data models for both the current state and the target state
- Set the strategy for architecture to support the business and IT strategies while maintaining the data architecture principles
- Lead architecture governance for the portfolio and provide subject-matter-expert input to design decisions across teams within the portfolio
- Manage a holistic roadmap of architecture change initiatives, coordinating requirements across different initiatives
- Be a key stakeholder and advisor in all new strategic data initiatives and ensure alignment to the enterprise-wide data strategy
- Build a framework of principles to ensure data integrity across the business
- Build and maintain appropriate data architecture artifacts, including entity relationship models, a data dictionary, and a taxonomy to aid data traceability
- Provide technical oversight to solution architects in creating business-driven solutions adhering to enterprise architecture and data governance standards
- Develop key performance measures for data integration and quality
- Support third-party data suppliers in developing specifications that are congruent with the enterprise data architecture
- Act on ad-hoc duties as assigned

Core/Must-Have Skills
- Proficiency in application architecture, engineering practices, and big data implementations
- Able to develop, maintain, and optimize data pipelines and workflows using Databricks (a minimal PySpark sketch follows this listing)
- Strong engineering skills, including knowledge of languages such as Java (Hive, Apache Hadoop) and Scala (Apache Spark, Kafka)
- Understanding of data management tools to mine data, apply data masking techniques, and automate test data generation for test execution
- Strong in the market and reference data domain
- Strong understanding of data pipeline (extract, transform, and load) processes and supporting technologies such as Oracle PL/SQL, Unix shell scripting, Python, Hadoop Spark & Scala, Databricks, AWS, Azure, etc.
- Excellent problem-solving and data modeling skills (logical, physical, semantic, and integration models), including normalization, OLAP/OLTP principles, and entity relationship analysis
- Experience creating and implementing data strategies that align with business objectives
- Excellent communication and presentation skills, a confident and methodical approach, and the ability to work within a team environment
- Enthusiastic and proactive, with the drive to "make things happen"
- Ability to identify gaps in processes and introduce new tools and standards to increase efficiency and productivity
- Ability to work to deadlines in a fast-paced environment
- Ability to take ownership and initiative
- Self-motivated, with the ability to influence others

Good To Have Skills
Domain knowledge of market data

Work Schedule
Hybrid

Keywords (If any)
Job ID: R-769091
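Below is a minimal, hedged PySpark sketch of the kind of transform step such pipelines run on Databricks; the data, column names, and bucketing rule are invented, and a real MDH job would read from governed sources rather than an in-memory list.

# Minimal PySpark sketch of a normalization/transform step of the kind
# run on Databricks. Data and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdh_sketch").getOrCreate()

# Stand-in for reading raw market data from a governed source.
raw = spark.createDataFrame(
    [("AAPL", "2024-01-02", 185.64), ("AAPL", "2024-01-03", 184.25)],
    ["ticker", "trade_date", "close_px"],
)

# Typical steps: type the date column, derive a field, and deduplicate
# on the business key.
clean = (
    raw.withColumn("trade_date", F.to_date("trade_date"))
       .withColumn("px_bucket", F.when(F.col("close_px") >= 185, "high").otherwise("low"))
       .dropDuplicates(["ticker", "trade_date"])
)

clean.show()
spark.stop()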

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detailed designs and maintaining product roadmaps that cover both modernization and cloud adoption
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modeling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (target database Db2)
- Experience in installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes

Posted 3 weeks ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description
Talent Acquisition: Quality of hiring, periodic reviews with TAC, and intervention for speedy closure of urgent requirements. Gender diversity through role identification for female hiring.
Talent Management: Identification and retention of HiPos; tracking career progression of Star Leads and HiPos. Engagement and retention of female employees and their career progression.
PMS: Drive the end-to-end performance review process, including KRA setting, mid-term review, and annual review. Also play a critical role in normalization and promotion.
Preparation of the HR deck for ABP: zero-based salary projection and resource planning.
HR Analytics: Analyze manpower and attrition trends, understand the key reasons for attrition and its triggers, and suggest corrective action plans for better retention.
L&D: Nominate employees based on training needs identified through IDPs; identify, conceptualize, and drive business-specific training programs.
Employee Connect: Periodic site visits and audits.
Key Responsibilities
Should have experience handling a similar HRBP role.

Posted 3 weeks ago

Apply

7.0 - 13.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

OT Security Consultant (Asset Management)
Job location - Mumbai
Experience - 7-13 years
- Create and maintain a comprehensive OT asset inventory and ensure real-time visibility across network-connected and legacy systems
- Integrate asset data with cybersecurity platforms and support vulnerability and compliance initiatives through detailed tracking
- Automated and manual asset discovery using SNMP, Modbus, and OPC protocols
- Configuration baseline analysis and anomaly tracking
- CMDB integration with tools like ServiceNow or SolarWinds
- Regulatory reporting aligned with OT cybersecurity mandates
- Asset fingerprinting, normalization, and lifecycle mapping (see the sketch after this listing)
- Proficient with discovery platforms such as TXOne Element, Claroty CTD, and ServiceNow CMDB
- Experience in passive and active asset fingerprinting across diverse OT protocols
- Skilled in tagging assets by Purdue level, criticality, and operational function
- Supports asset reconciliation workflows, enrichment audits, and lifecycle visibility
- Applies IEC 62443-1-2 and 2-1 practices for asset governance and monitoring
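A hedged, pure-Python sketch of what asset fingerprint normalization can look like: raw discovery records from different scanners are merged into one canonical inventory keyed by MAC address. The field names and the Purdue-level tagging rule are invented for illustration and are not tied to any specific discovery platform.

# Hedged sketch of OT asset fingerprint normalization: merge raw
# discovery records into a canonical inventory keyed by MAC address.
from collections import defaultdict

raw_records = [
    {"mac": "00:1A:2B:3C:4D:5E", "src": "snmp",   "vendor": "Siemens",  "role": "plc"},
    {"mac": "00:1a:2b:3c:4d:5e", "src": "modbus", "fw": "v2.1"},
    {"mac": "AA:BB:CC:DD:EE:FF", "src": "snmp",   "vendor": "Rockwell", "role": "hmi"},
]

# Invented mapping from device role to Purdue reference level.
PURDUE_LEVEL = {"plc": 1, "hmi": 2}

inventory = defaultdict(dict)
for rec in raw_records:
    key = rec["mac"].upper()  # normalize the identifier before merging
    entry = inventory[key]
    entry.setdefault("sources", set()).add(rec["src"])
    for field in ("vendor", "role", "fw"):
        if field in rec:
            entry.setdefault(field, rec[field])  # first writer wins
    if "role" in entry:
        entry["purdue_level"] = PURDUE_LEVEL.get(entry["role"])

for mac, attrs in inventory.items():
    print(mac, attrs)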

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Requirements:
• Minimum 2 years of experience
• Must have a minimum of 2 years' experience working on .NET Core, with knowledge of LINQ
• Strong experience in database design, RDBMS, and SQL
• Experience in Windows application development is an additional advantage
• Expert in MS SQL relational database design, normalization, and the use of stored procedures (SPs), triggers, and views
• Able to configure MS SQL Server and deploy databases
• Well versed in JavaScript and jQuery
• Demonstrated experience supporting Microsoft SQL Server 2008 and 2012
• Prior experience with Windows and web-based applications

Responsibilities:
1. Application Development & Maintenance
• Develop, maintain, and optimize both Windows and web-based applications using .NET Core, C#, and LINQ.
• Ensure high performance, scalability, and security of applications.
2. Database Management & Optimization
• Design and manage relational databases using MS SQL Server.
• Create and optimize stored procedures, views, and triggers, and ensure normalization and efficient query performance (a hedged sketch of a parameterized stored-procedure call follows this listing).
3. System Configuration & Deployment
• Configure and deploy SQL Server databases across environments.
• Manage version control and deployment processes for applications and databases.
4. Frontend Integration & Scripting
• Implement frontend functionality using JavaScript and jQuery.
• Collaborate with UI/UX teams to ensure seamless integration of user-facing elements with backend logic.
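The parameterized stored-procedure pattern is language-agnostic; here is a minimal sketch in Python via pyodbc (the language used for the other sketches in this document). The driver name, connection details, and procedure name are assumptions for illustration only.

# Hedged sketch of calling a SQL Server stored procedure with a
# parameter via pyodbc. Driver, database, and procedure names are
# assumptions, not a specific deployment.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=OrdersDb;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Parameterized call: the ? placeholder keeps the input out of the SQL
# text, which matters for both plan reuse and injection safety.
cur.execute("{CALL dbo.usp_GetOrdersByCustomer (?)}", 42)
for row in cur.fetchall():
    print(row)

conn.close()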

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Lead Data Engineer (Remote for India only)
- Strong hands-on expertise in SQL, DBT, and Python for data processing and transformation
- Expertise in Azure data services (e.g., Azure Data Factory, Synapse, Event Hub) and orchestration tools
- Strong experience with Snowflake, including schema design, performance tuning, and the security model
- Good understanding of DBT for the transformation layer and modular pipeline design
- Hands-on with Git and version control practices: branching, pull requests, code reviews
- Understanding of DevOps/DataOps principles: CI/CD for data pipelines, testing, monitoring
- Knowledge of data modeling techniques: star schema, Data Vault, normalization/denormalization
- Experience with real-time data processing architectures is a strong plus
- Proven leadership experience; should be able to mentor team members, take ownership, and make design decisions independently
- Strong sense of ownership, accountability, and a solution-oriented mindset
- Ability to handle ambiguity and work independently with minimal supervision
- Clear and confident communication (written and verbal); must be able to represent design and architecture decisions

Responsibilities:
- Lead the design and development of data pipelines (batch and real-time) using modern cloud-native technologies (Azure, Snowflake, DBT, Python)
- Translate business and data requirements into scalable data integration designs
- Guide and review development work across data engineering team members (onshore and offshore)
- Define and enforce best practices for coding, testing, version control, CI/CD, data quality, and pipeline monitoring
- Collaborate with data analysts, architects, and business stakeholders to ensure data solutions are aligned with business goals
- Own and drive end-to-end data engineering workstreams, from design to production deployment and support
- Provide architectural and technical guidance on platform setup, performance tuning, cost optimization, and data security
- Drive data engineering standards and reusable patterns across projects to ensure scalability, maintainability, and reusability of code and data assets
- Define and oversee data quality frameworks to proactively detect, report, and resolve data issues across ingestion, transformation, and consumption layers (a minimal sketch follows this listing)
- Act as the technical go-to team member for complex design, performance, or integration issues across multiple teams and tools (e.g., DBT + Snowflake + Azure pipelines)
- Contribute hands-on development for the end-to-end integration pipelines and workflows
- Document using Excel, Word, or tools like Confluence
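As a minimal, hedged sketch of the data-quality-framework idea, here is a pure-Python check that flags null and duplicate-key violations in a batch of rows before they reach the consumption layer; the rule names and sample data are invented, and production versions typically live in DBT tests or a dedicated DQ service.

# Hedged sketch of a batch data-quality check: flag null and
# duplicate-key violations before rows reach the consumption layer.
from collections import Counter

def run_dq_checks(rows, key="order_id", required=("order_id", "amount")):
    issues = []
    # Completeness: required fields must be present and non-null.
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                issues.append(f"row {i}: null {field}")
    # Uniqueness: the business key must not repeat within the batch.
    counts = Counter(r.get(key) for r in rows if r.get(key) is not None)
    for k, n in counts.items():
        if n > 1:
            issues.append(f"duplicate {key}={k} ({n} rows)")
    return issues

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 12.5},   # duplicate key
    {"order_id": 2, "amount": None},   # null amount
]
for issue in run_dq_checks(batch):
    print(issue)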

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies