3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At Squareshift, we are looking for a skilled Consultant with 3 to 6 years of experience in Elasticsearch and Observability Solutions. The ideal candidate will be responsible for implementing, configuring, and optimizing Elasticsearch-based solutions, ensuring efficient log collection, data migration, and monitoring services. This role requires hands-on expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana, Beats) along with experience in cloud platforms (AWS, GCP, or Azure) and Kubernetes. The consultant will work closely with clients to implement observability solutions, develop custom pipelines, and ensure seamless integration with third-party applications. Responsibilities: Deploy, configure, and optimize Elasticsearch, Logstash, Kibana, and Beats. Implement ILM policies, manage data retention, and configure security settings. Perform snapshot and restore for data migration across clusters. Deploy and configure log collection agents on multiple machines. Design and implement custom pipelines for log processing and transformation. Normalize log data and apply ECS mapping for structured observability. Configure Heartbeat, APM, and RUM for system uptime and performance monitoring. Set up synthetic monitoring and define alerting mechanisms. Guide client teams in integrating Java APM agents and other observability tools. Implement role-based access control (RBAC) and secure clusters. Ensure compliance with best practices for Elasticsearch security. Work with clients to integrate Elastic Security for threat detection and response. Collaborate with client teams to gather requirements and provide solutions. Deliver documentation and training sessions to ensure knowledge transfer. Support troubleshooting and performance tuning for Elasticsearch deployments. Requirements: 3 to 6 years of experience in Elasticsearch, Observability, or Cloud Data Engineering. Excellent communication skills are mandatory for effective client interactions and solution presentations. Strong expertise in Elasticsearch, Logstash, Kibana, and Beats. Experience in Elastic Security, ILM, and data migration strategies. Proficiency in cloud platforms (AWS, GCP, or Azure) and Kubernetes. Hands-on experience in log collection, normalization, and custom pipeline development. Strong knowledge of Linux environments, scripting, and automation. This job was posted by Helora Padmini from Squareshift.
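For readers unfamiliar with the ILM-policy work this role describes, here is a minimal sketch that creates a rollover-plus-retention policy through Elasticsearch's `_ilm` REST API; the cluster URL, credentials, policy name, and thresholds are all hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: create an ILM policy that rolls over hot indices and
# deletes data after a retention window. Endpoint, credentials, and the
# concrete thresholds are placeholders.
import requests

ES_URL = "https://elasticsearch.example.com:9200"  # hypothetical cluster
AUTH = ("elastic", "changeme")                     # hypothetical credentials

policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    "rollover": {"max_primary_shard_size": "50gb", "max_age": "30d"}
                }
            },
            "delete": {
                "min_age": "90d",            # retain 90 days after rollover
                "actions": {"delete": {}},
            },
        }
    }
}

resp = requests.put(f"{ES_URL}/_ilm/policy/logs-retention", json=policy, auth=AUTH)
resp.raise_for_status()
print(resp.json())  # {"acknowledged": true} on success
```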
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary: The Sr. Infra Developer will be responsible for designing, implementing, and maintaining infrastructure solutions using BMC Helix technologies. This role requires a deep understanding of BMC Helix Remedy, SmartReporting, DWP & SmartIT Dashboards, CMDB, and Remedy. The candidate will work in a hybrid model with rotational shifts, ensuring seamless infrastructure operations and contributing to the company's technological advancements. Responsibilities: Design, develop, and maintain integrations from sources of record/tools to BMC CMDB. Design and develop Remedy workflow and code. Design, develop, and maintain CMDB reconciliation and normalization logic. Integrate CMDB with multiple systems using Atrium Spoon and Atrium Integrator. Perform CMDB admin activities including data load, normalization, reconciliation, precedence rules configuration, etc. Knowledge of CMDB data dependencies with ITSM modules such as Change Management and Incident Management. Design and develop CMDB reports using Smart Reporting/BMC Helix Dashboards. Knowledge of a third-party reporting tool (e.g., Power BI) is good to have. Documentation of CMDB development and integration. CMDB enhancements and development on the BMC SaaS Helix Platform. Configuration and administration of OOTB BMC Helix CMDB. Effective communication. Drive and lead design and development recommendations to standardize, improve, or redesign processes. Recommend and implement changes to the existing data model that will mature use of the CMDB. Certifications Required: BMC Certified Professional in Helix Remedy or equivalent certification.
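The normalization and precedence-based reconciliation logic mentioned above is configured inside Atrium/Helix rather than hand-coded, but the underlying idea can be sketched generically; the source names, fields, and precedence order below are invented for illustration and are not BMC APIs.

```python
# Generic sketch of CMDB-style normalization + precedence-based reconciliation.
# Sources, fields, and precedence are illustrative only.
SOURCE_PRECEDENCE = {"discovery": 3, "asset_db": 2, "spreadsheet": 1}

def normalize(ci: dict) -> dict:
    """Standardize attribute formats so records from different sources align."""
    return {
        "name": ci["name"].strip().lower(),
        "serial": ci.get("serial", "").replace("-", "").upper(),
        "source": ci["source"],
    }

def reconcile(records: list[dict]) -> dict[str, dict]:
    """Merge CIs by serial number, letting the highest-precedence source win."""
    golden: dict[str, dict] = {}
    for ci in sorted(records, key=lambda r: SOURCE_PRECEDENCE[r["source"]]):
        golden[ci["serial"]] = ci  # later (higher-precedence) records overwrite
    return golden

raw = [
    {"name": "App-Server-01 ", "serial": "ab-123", "source": "spreadsheet"},
    {"name": "app-server-01", "serial": "AB123", "source": "discovery"},
]
print(reconcile([normalize(r) for r in raw]))  # one golden record survives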
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Cochin
Remote
We are seeking a knowledgeable and experienced Microsoft SQL Trainer to deliver comprehensive training programs on Microsoft SQL Server. The trainer will be responsible for equipping learners with the skills to query, manage, and administer SQL databases efficiently. The role involves curriculum development, hands-on demonstrations, and guiding learners through real-time projects. Key Responsibilities: Design and deliver training sessions on Microsoft SQL Server (T-SQL, DDL/DML, stored procedures, indexing, performance tuning, etc.) Teach database fundamentals, SQL query writing, database design, and optimization techniques. Develop training materials, manuals, exercises, and assessments. Conduct workshops for beginner to advanced-level learners. Provide hands-on experience through projects and lab sessions. Evaluate student performance and provide constructive feedback. Stay current with the latest versions and features of SQL Server and incorporate them into training. Customize training programs to suit various industry or domain needs. Mentor students for certification exams (such as Microsoft Certified: Azure Data Fundamentals / Database Administrator Associate). Required Skills & Qualifications: Strong experience in Microsoft SQL Server and T-SQL (minimum 3–5 years preferred). Proficient in database development, data modeling, optimization, and administration. Experience with tools like SSMS, Azure Data Studio, and SQL Profiler. Good understanding of relational database concepts and normalization. Prior experience in teaching/training is highly desirable. Excellent communication and presentation skills. Ability to explain complex concepts in a simple and engaging manner. Preferred Qualifications: Microsoft certifications (e.g., MCSA: SQL Server, Azure Data Engineer, etc.). Exposure to cloud-based SQL solutions like Azure SQL Database. Knowledge of Power BI or integration with reporting tools is a plus. Work Environment: Flexible work hours (for remote/part-time roles). Interactive classroom or online training sessions. Continuous learning and upskilling environment. Job Types: Part-time, Freelance Schedule: Day shift, Evening shift, Fixed shift, Monday to Friday, Morning shift, Night shift, Rotational shift, Weekend availability Work Location: In person
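Training on T-SQL stored procedures usually pairs with showing how client applications call them; a minimal sketch using the pyodbc driver, where the server, database, credentials, procedure name, and result columns are hypothetical placeholders.

```python
# Minimal sketch: call a T-SQL stored procedure with a parameter via pyodbc.
# Server, database, credentials, and procedure/column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=TrainingDB;"
    "UID=trainer;PWD=secret"
)
cursor = conn.cursor()

# Parameterized EXEC avoids string concatenation and SQL injection,
# a standard point in query-writing training.
cursor.execute("EXEC dbo.usp_GetOrdersByCustomer @CustomerId = ?", 42)
for row in cursor.fetchall():
    print(row.OrderId, row.OrderDate)  # columns assumed for illustration

conn.close()
```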
Posted 1 month ago
5.0 years
0 Lacs
Delhi
On-site
The Role Context: This is an exciting opportunity to join a dynamic and growing organization, working at the forefront of technology trends and developments in the social impact sector. Wadhwani Center for Government Digital Transformation (WGDT) works with the government ministries and state departments in India with a mission of “Enabling digital transformation to enhance the impact of government policy, initiatives and programs”. We are seeking a highly motivated and detail-oriented individual to join our team as a Data Engineer with experience in designing, constructing, and maintaining the architecture and infrastructure necessary for data generation, storage, and processing, and in contributing to the successful implementation of digital government policies and programs. You will play a key role in developing robust, scalable, and efficient systems to manage large volumes of data, making it accessible for analysis and decision-making, and driving innovation and optimizing operations across various government ministries and state departments in India. Key Responsibilities: a. Data Architecture Design: Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions. b. Data Integration: Integrate data from various sources such as databases, APIs, streaming platforms, and third-party systems. Should ensure the data is collected reliably and efficiently, maintaining data quality and integrity throughout the process as per the Ministries'/government data standards. c. Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval. Use techniques such as dimensional modeling, normalization, and denormalization depending on the specific requirements of the project. d. Data Pipeline Development/ETL (Extract, Transform, Load): Develop data pipeline/ETL processes to extract data from source systems, transform it into the desired format, and load it into the target data systems (a minimal sketch follows this posting). This involves writing scripts, using ETL tools, or building data pipelines to automate the process and ensure data accuracy and consistency. e. Data Quality and Governance: Implement data quality checks and data governance policies to ensure data accuracy, consistency, and compliance with regulations. Should be able to design and track data lineage, data stewardship, metadata management, building a business glossary, etc. f. Data Lakes and Warehousing: Design and maintain data lakes and data warehouses to store and manage structured data from relational databases, semi-structured data like JSON or XML, and unstructured data such as text documents, images, and videos at any scale. Should be able to integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as with machine learning and data visualization tools. g. Data Security: Implement security practices, technologies, and policies designed to protect data from unauthorized access, alteration, or destruction throughout its lifecycle. This should include data access, encryption, data masking and anonymization, data loss prevention, and compliance with regulatory requirements such as DPDP, GDPR, etc. h. Database Management: Administer and optimize databases, both relational and NoSQL, to manage large volumes of data effectively. i. 
Data Migration: Plan and execute data migration projects to transfer data between systems while ensuring data consistency and minimal downtime. a. Performance Optimization: Optimize data pipelines and queries for performance and scalability. Identify and resolve bottlenecks, tune database configurations, and implement caching and indexing strategies to improve data processing speed and efficiency. b. Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide them with access to the necessary data resources. Also work closely with IT operations teams to deploy and maintain data infrastructure in production environments. c. Documentation and Reporting: Document work including data models, data pipelines/ETL processes, and system configurations. Create documentation and provide training to other team members to ensure the sustainability and maintainability of data systems. d. Continuous Learning: Stay updated with the latest technologies and trends in data engineering and related fields. Participate in training programs, attend conferences, and engage with the data engineering community to enhance skills and knowledge. Desired Skills/Competencies: Education: A Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent with at least 5 years of experience. Database Management: Strong expertise in working with databases, such as SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra). Big Data Technologies: Familiarity with big data technologies, such as Apache Hadoop, Spark, and related ecosystem components, for processing and analyzing large-scale datasets. ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Talend, Apache Airflow, Talend Open Studio, Pentaho, Infosphere) for designing and orchestrating data workflows. Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security. Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable and cost-effective data storage and processing. Streaming Data Processing: Familiarity with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data. KPIs: Data Pipeline Efficiency: Measure the efficiency of data pipelines in terms of data processing time, throughput, and resource utilization. KPIs could include average time to process data, data ingestion rates, and pipeline latency. Data Quality Metrics: Track data quality metrics such as completeness, accuracy, consistency, and timeliness of data. KPIs could include data error rates, missing values, data duplication rates, and data validation failures. System Uptime and Availability: Monitor the uptime and availability of data infrastructure, including databases, data warehouses, and data processing systems. KPIs could include system uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR). Data Storage Efficiency: Measure the efficiency of data storage systems in terms of storage utilization, data compression rates, and data retention policies. KPIs could include storage utilization rates, data compression ratios, and data storage costs per unit. 
Data Security and Compliance: Track adherence to data security policies and regulatory compliance requirements such as DPDP, GDPR, HIPAA, or PCI DSS. KPIs could include security incident rates, data access permissions, and compliance audit findings. Data Processing Performance: Monitor the performance of data processing tasks such as ETL (Extract, Transform, Load) processes, data transformations, and data aggregations. KPIs could include data processing time, CPU usage, and memory consumption. Scalability and Performance Tuning: Measure the scalability and performance of data systems under varying workloads and data volumes. KPIs could include scalability benchmarks, system response times under load, and performance improvements achieved through tuning. Resource Utilization and Cost Optimization: Track resource utilization and costs associated with data infrastructure, including compute resources, storage, and network bandwidth. KPIs could include cost per data unit processed, cost per query, and cost savings achieved through optimization. Incident Response and Resolution: Monitor the response time and resolution time for data-related incidents and issues. KPIs could include incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services. Documentation and Knowledge Sharing: Measure the quality and completeness of documentation for data infrastructure, data pipelines, and data processes. KPIs could include documentation coverage, documentation update frequency, and knowledge sharing activities such as internal training sessions or knowledge base contributions. Years of experience of the current role holder: New Position. Ideal years of experience: 3 – 5 years. Career progression for this role: CTO WGDT (Head of Incubation Centre). Our Culture: WF is a global not-for-profit, and works like a start-up, in a fast-moving, dynamic pace where change is the only constant and flexibility is the key to success. Three mantras that we practice across job roles, levels, functions, programs and initiatives, are Quality, Speed, Scale, in that order. We are an ambitious and inclusive organization, where everyone is encouraged to contribute and ideate. We are intensely and insanely focused on driving excellence in everything we do. We want individuals with the drive for excellence, and passion to do whatever it takes to deliver world class outcomes to our beneficiaries. We set our own standards often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment – one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Open to collaborate with a borderless mentality, often going beyond the hierarchy and siloed definitions of functional KRAs, are the individuals who will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles and those looking for stability and orderly working days may not find WF to be the right place for them. Finally, we want individuals who want to do greater good for the society leveraging their area of expertise, skills and experience. 
The foundation is an equal opportunity firm with no bias towards gender, race, colour, ethnicity, country, language, age or any other dimension that comes in the way of progress. Join us and be a part of us! Education: Bachelor's in Technology / Master's in Technology
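As a concrete illustration of the pipeline/ETL responsibility in the posting above, here is a minimal extract-transform-load pass in Python; the API endpoint, column names, table name, and connection string are hypothetical placeholders.

```python
# Minimal ETL sketch: extract JSON from an API, normalize it with pandas,
# load it into a relational target. All endpoints/credentials are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Extract: pull raw records from a (hypothetical) source system API.
raw = requests.get("https://api.example.gov/v1/beneficiaries", timeout=30).json()

# Transform: flatten, standardize column names, enforce types, drop duplicates.
df = pd.json_normalize(raw)
df.columns = [c.lower().replace(".", "_") for c in df.columns]
df["enrolled_on"] = pd.to_datetime(df["enrolled_on"], errors="coerce")
df = df.drop_duplicates(subset=["beneficiary_id"])

# Load: append the cleaned batch into a warehouse staging table.
engine = create_engine("postgresql://etl_user:secret@warehouse.example:5432/dwh")
df.to_sql("stg_beneficiaries", engine, if_exists="append", index=False)
print(f"Loaded {len(df)} rows")
```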
Posted 1 month ago
0 years
0 Lacs
Bengaluru
On-site
Ready to build the future with AI? At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Principal Consultant - Sr. Performance DBA, Teradata! Responsibilities: Strong expertise in writing and optimizing Teradata SQL queries, TPT scripts, etc. Manage Production/Development database performance. Review Teradata system reports and provide a “performance assessment” report with recommendations to optimize the system. Investigate and quantify opportunities from “performance assessment” reports and apply best practices in each of the areas. Monitor Teradata system performance using different portlets in the Viewpoint tool. Review poor-performing queries generated from BI/ETL tools and provide best-practice recommendations on how to simplify and restructure views and apply PPI or other index changes. Closely monitor the performance of various work groups on the system and make sure data is available to business as per the SLA requirement. Optimal index analysis - review index usage on tables and recommend adding or dropping indexes for optimal data access. Review uncompressed tables, analyse their usage, and implement compression to save space and reduce IO activity, using algorithms like MVC/BLC/ALC. Optimize locking statements in views, macros, and queries to eliminate blocking contention. Review the spool limits for users and recommend optimal limits for ad-hoc users to avoid runaway queries over-consuming system resources. Check for mismatched data types in the system and make them consistent to avoid costly translations during query processing. Review SET tables and check for options to convert to MULTISET to avoid the costly duplicate-row-checking operation. Review large-scan tables on the system and analyze them for PPI, MLPPI, compression, secondary indexes, and join indexes. Analyze various applications, understand the space requirements, and segregate the disk space under the categories of perm, spool, and temp space. Set up the database hierarchy, including database creation and management of objects such as users, roles, profiles, tables, and views. Maintain profiles, roles, access rights, and permissions for Teradata user groups and objects. Generate periodic performance reports using PDCR and identify bottlenecks in system performance. Establish PDCR canary performance baselines. Utilize standard canary queries to identify variance from baseline. 
Effective usage of TASM and priority distribution to penalize resource-intensive queries, give high priority to business-critical workloads, and throttle different workloads for optimal throughput; provide performance reports to check workload management health. Qualifications we seek in you! Minimum qualifications: Teradata Performance DBA experience. Experience in reviewing poor-performing queries and providing best-practice recommendations on how to simplify and restructure views and apply PPI or other index changes. Statistics management and optimization. Exposure to DWH environments (knowledge of ETL/DI/BI reporting). Exposure to troubleshooting TPT/FastLoad/MultiLoad/FastExport/BTEQ/TPump errors; should be good at error handling. Experience in fine-tuning various application parameters/number of sessions to ensure optimal functioning of the application. Well conversant with ticketing systems, production change requests, and Teradata incident management. Should be good at automating various processes. Ability to write efficient SQL and exposure to query tuning. Preferably understands normalization and de-normalization concepts. Preferable exposure to visualization tools like Tableau and Power BI. Preferably good working knowledge of UNIX shell and Python scripting. Good to have exposure to FSLDM. Good to have exposure to the GCFR framework. Why join Genpact? Lead AI-first transformation – Build and scale AI solutions that redefine industries Make an impact – Drive change for global enterprises and solve business challenges that matter Accelerate your career – Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills Grow with the best – Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace Committed to ethical AI – Work in an environment where governance, transparency, and security are at the core of everything we build Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job: Principal Consultant. Primary Location: India-Bangalore. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jun 27, 2025, 11:07:39 AM. Unposting Date: Ongoing. Master Skills List: Digital. Job Category: Full Time
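Much of the query-review work this posting describes starts from an optimizer EXPLAIN plan; a minimal sketch using Teradata's official teradatasql Python driver, where the host, credentials, and the query itself are placeholders.

```python
# Minimal sketch: pull the optimizer plan for a suspect query using the
# teradatasql driver. Host, credentials, and the SQL are placeholders.
import teradatasql

with teradatasql.connect(host="tdprod.example.com", user="dba", password="secret") as con:
    cur = con.cursor()
    cur.execute(
        "EXPLAIN SELECT * FROM sales.orders WHERE order_date > DATE '2025-01-01'"
    )
    for row in cur.fetchall():
        print(row[0])  # each row is one line of the plan text
```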
Posted 1 month ago
55.0 years
0 Lacs
Noida
Remote
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Your Role: Lead the design, implementation, and optimization of robust PostgreSQL database solutions for global clients. Leverage deep expertise in PostgreSQL to architect scalable, secure, and high-performing database systems for business-critical applications. Design and implement end-to-end PostgreSQL database architectures tailored to specific client requirements around performance, availability, and data management. Collaborate with cross-functional teams to integrate PostgreSQL with diverse technology stacks such as Python, Java, Node.js, and cloud platforms (AWS, Azure, GCP). Conduct technical workshops, gather database-related requirements, and translate business needs into effective database solutions. Optimize database performance through advanced SQL tuning, indexing strategies, and efficient query design. Ensure data integrity, implement robust backup and recovery procedures, and configure high availability and failover solutions. Provide mentorship and technical guidance to junior DBAs and developers. Promote adherence to industry best practices, security standards, and quality guidelines in database management. Your Profile: Proven experience in PostgreSQL database administration, design, and performance tuning. Strong understanding of relational database design, normalization, indexing, and query optimization. Hands-on experience with advanced SQL, stored procedures, triggers, and functions. Experience in integrating PostgreSQL with cloud platforms and application environments. Ability to lead client engagements, provide technical guidance, and manage project deliverables. Familiarity with PostgreSQL extensions (e.g., PostGIS, TimescaleDB) and replication strategies. Strong communication and organizational skills, with a collaborative and proactive mindset. What you’ll love about working with us: We value flexibility and support a healthy work-life balance through remote and hybrid work options. Competitive compensation and benefits. Career development programs and certifications in SAP and cloud technologies. A diverse and inclusive workplace that fosters innovation and collaboration. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
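To illustrate the tuning work this role describes: checking whether a candidate index is actually used typically comes down to reading EXPLAIN ANALYZE output. A minimal psycopg2 sketch follows, with hypothetical connection details, table, and column names.

```python
# Minimal sketch: add a candidate index and inspect the plan with
# EXPLAIN ANALYZE. Connection details and table/column names are hypothetical.
import psycopg2

conn = psycopg2.connect("host=db.example.com dbname=appdb user=dba password=secret")
conn.autocommit = True
cur = conn.cursor()

cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)")

cur.execute(
    "EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM orders WHERE customer_id = %s", (42,)
)
for (line,) in cur.fetchall():
    print(line)  # look for 'Index Scan using idx_orders_customer' vs 'Seq Scan'

cur.close()
conn.close()
```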
Posted 1 month ago
2.0 years
3 - 4 Lacs
Noida
On-site
Position: Web Developer. We are looking for a highly skilled Web Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks. Key Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain high-quality, efficient, and well-documented code. Troubleshoot and resolve technical issues. Implement Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects. Work with RDBMS design, normalization, data modelling, transactions, and distributed databases. Develop and maintain database PL/SQL, stored procedures, and triggers. Requirements: 2+ years of experience in web-based project development using PHP. Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana. Strong knowledge of Object-Oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies. Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases. Well-versed with RDBMS MySQL (can work with other SQL flavors too). Job Type: Full-time. Pay: ₹25,000.00 - ₹35,000.00 per month. Application Question(s): What's your current CTC? What's your expected CTC? Experience: Core PHP: 2 years (Preferred). Laravel: 2 years (Preferred). WordPress: 2 years (Preferred). Location: Noida, Uttar Pradesh (Required). Work Location: In person
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Business Analyst. Project Role Description: Analyze an organization and design its processes and systems, assessing the business model and its integration with technology. Assess current state, identify customer requirements, and define the future state and/or business solution. Research, gather and synthesize information. Must-have skills: Core Banking. Good-to-have skills: Business Requirements Analysis. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Business Analyst, you will analyze an organization and design its processes and systems, assessing the business model and its integration with technology. You will assess the current state, identify customer requirements, and define the future state and/or business solution. Research, gather, and synthesize information. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work-related problems. - Develop business strategies and provide recommendations for process improvements. - Collaborate with stakeholders to gather and analyze business requirements. - Create detailed documentation of business processes and system requirements. - Conduct gap analysis and propose solutions to enhance business operations. - Assist in the implementation and testing of new systems or processes. Professional & Technical Skills: - Must-Have Skills: Proficiency in Core Banking. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Core Banking. - This position is based at our Pune office. - A 15 years full-time education is required.
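Since the posting lists hands-on implementation of algorithms such as logistic regression, here is a minimal scikit-learn sketch on synthetic data; the dataset is generated for illustration and involves no real banking data.

```python
# Minimal sketch: train and evaluate a logistic regression classifier
# on synthetic data. Purely illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```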
Posted 1 month ago
5.0 years
0 Lacs
Chhattisgarh, India
On-site
Job Description About Sarvang: Sarvang Infotech India Limited, established in 2005, is a trusted IT solutions provider catering to clients across India, Africa, and the UAE. We specialize in delivering enterprise-grade technology solutions for industries such as Metal, Mining, Power, and other large corporates. Driven by our belief in innovation, we deliver scalable and mission-critical systems that power digital transformation. Role Overview: We are looking for an experienced and results-driven Senior SQL Server Developer to join our technical team. This role involves designing, developing, and optimizing complex database systems, procedures, and reports in support of software applications. You will be a key contributor to database architecture, performance tuning, and data integrity in our enterprise projects. Job Location: Korba, Chhattisgarh; Bilaspur, Chhattisgarh; Raipur, Chhattisgarh; Durg, Chhattisgarh; Bhilai, Chhattisgarh. Key Responsibilities: Design, develop, and maintain SQL Server databases, stored procedures, triggers, and views. Work closely with application developers to optimize queries and database performance. Ensure database security, integrity, stability, and system availability. Implement and optimize database backup, recovery, and archiving strategies. Monitor and troubleshoot database performance issues using tools like SQL Profiler, Execution Plans, etc. Design data models and perform database normalization for new and existing applications. Assist in migration, upgrades, and patches of SQL Server environments. Support business reporting needs through data extraction and complex SQL queries. Collaborate with cross-functional teams to gather requirements and deliver database solutions. Document database structures, processes, and best practices. Why Join Sarvang? Opportunity to work on mission-critical systems for top corporates in Metal, Mining, and Power sectors. Dynamic and innovation-driven work culture. Supportive leadership and professional development opportunities. Competitive compensation and growth-focused career path. Apply Now: If you are passionate about databases, performance optimization, and working on real enterprise challenges, Sarvang is the right place for you. Job Requirement: Bachelor’s degree in Computer Science, Information Technology, or related field. Minimum 5 years of hands-on experience in SQL Server database development and administration. Strong expertise in T-SQL, stored procedures, functions, triggers, and query optimization. Proficient in database design, normalization, indexing strategies, and performance tuning. Experience in SSIS, SSRS, or other Microsoft BI tools is a plus. Solid understanding of database security and user role management. Strong problem-solving skills and ability to handle production issues under pressure. Experience working with large datasets in enterprise-level applications. Excellent communication and documentation skills.
Posted 1 month ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
About Company: Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently. We support flexible engagement models including Time & Material, Staff Augmentation, and SoW-based Managed Services. About The Role: We’re hiring a Senior Machine Learning Developer to join our data modernization and AI engineering team. In this role, you will design, develop, and deploy ML models and intelligent systems that drive next-gen financial insights. You will work with structured and unstructured data on a secure cloud-based infrastructure leveraging Azure ML, Python, and modern NLP/AI frameworks. Key Responsibilities: Build, train, and implement ML/NLP models for classification, clustering, and text analysis. Work with large-scale financial data and develop intelligent automation solutions. Clean, normalize, and validate structured and unstructured datasets. Build and consume REST APIs for ML services. Integrate Azure AI services including Cognitive Services, OpenAI, and Form Recognizer. Collaborate with cross-functional teams including DevOps, data engineers, and PMs. Follow best practices in model versioning, testing, and deployment. Document ML workflows and models for reproducibility and compliance. Required Skills & Experience: 6+ years of experience in software/data engineering, with 3+ years in ML/AI. Strong Python expertise with libraries like Scikit-learn, Pandas, NumPy, etc. Experience with ML frameworks such as TensorFlow or PyTorch. Experience building AI models for NLP tasks like classification, summarization, or entity extraction. Familiarity with Azure AI/ML services (Azure ML Studio, Cognitive Services, AKS, Key Vault). Experience handling both structured and unstructured datasets. REST API and Python library development. Excellent communication and documentation skills. Nice To Have: Exposure to finance sector datasets or financial document automation. Experience with Azure OpenAI, Azure Language Studio, or Chatbot frameworks. Familiarity with statistical programming languages like R or Julia. Microsoft Data or Azure AI/ML certifications. Benefits And Perks: Opportunity to work with leading global clients. Flexible work arrangements with remote options. Exposure to modern technology stacks and tools. Supportive and collaborative team environment. Continuous learning and career development opportunities. Skills: machine learning, natural language processing (NLP), NumPy, Azure ML, data normalization, Azure Cognitive Services, financial data, Azure Key Vault, Azure ML Studio, REST APIs, unstructured data, API development, OpenAI, data cleansing, REST API, Python, scikit-learn, AKS, TensorFlow, NLP, pandas, data engineering, Azure AI, PyTorch, statistical programming, text classification
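For the text-classification work described above, here is a minimal scikit-learn pipeline; the toy documents and labels are placeholders for the real financial corpora this role would handle, which might also call for transformer models.

```python
# Minimal sketch: TF-IDF + linear classifier for document classification.
# The toy documents/labels stand in for real financial text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "quarterly revenue increased due to strong loan growth",
    "invoice payment overdue, penalty applied",
    "board approved the annual dividend distribution",
    "vendor invoice received and queued for approval",
]
labels = ["report", "invoice", "report", "invoice"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(docs, labels)
print(clf.predict(["payment for the invoice is pending"]))  # -> ['invoice']
```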
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Tableau BI Developer Location: Hyderabad (100% On Site) Timings: 2 PM to 11 PM IST Job Summary: We are looking for an experienced Tableau BI Developer to join our data and analytics team. The ideal candidate will have 5–6 years of hands-on experience in developing interactive dashboards and reports using Tableau, translating business needs into actionable insights, and ensuring data accuracy and integrity across various sources. Key Responsibilities: Design, develop, and maintain interactive Tableau dashboards and visualizations based on business requirements. Collaborate with business stakeholders, analysts, and data engineers to gather and understand reporting needs. Translate complex datasets into accessible insights to drive business decisions. Optimize and fine-tune Tableau dashboards for performance and usability. Perform data analysis and validation to ensure data accuracy and consistency. Develop and maintain documentation of dashboards, data sources, and business logic. Work with SQL and data warehouses to prepare datasets and write complex queries for reporting purposes. Ensure best practices in data visualization, UX, and storytelling. Provide user training and support on Tableau dashboards and usage. Required Skills & Qualifications: 5–6 years of professional experience in Tableau development. Strong expertise in Tableau Desktop and Tableau Server/Online. Proficient in writing complex SQL queries for data extraction and transformation. Experience working with large datasets from relational databases like SQL Server, Oracle, Snowflake, or Redshift. Familiarity with ETL concepts and data warehousing principles. Strong understanding of data modeling, normalization, and performance tuning. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder management skills. Nice-to-Have: Experience with other BI tools (e.g., Power BI, QlikView). Exposure to cloud platforms (e.g., AWS, Azure, GCP). Knowledge of Python or R for data analysis and automation.
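Dataset preparation and validation usually happen before Tableau sees the data; a minimal sketch of pulling a reporting extract and running basic accuracy checks, with the connection string, query, and quality rules all hypothetical.

```python
# Minimal sketch: prepare a reporting extract and validate it before it
# feeds a dashboard. Connection string and query are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://report:secret@warehouse.example:5432/dwh")
df = pd.read_sql(
    "SELECT region, order_month, SUM(amount) AS revenue "
    "FROM sales GROUP BY region, order_month",
    engine,
)

# Basic data-quality gates: no null keys, no negative revenue,
# no duplicate (region, month) grain rows.
assert df[["region", "order_month"]].notna().all().all(), "null keys"
assert (df["revenue"] >= 0).all(), "negative revenue"
assert not df.duplicated(["region", "order_month"]).any(), "duplicate grain"

df.to_csv("revenue_extract.csv", index=False)  # hand off to Tableau
```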
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Tableau BI Developer Experience: 5–6 Years Location: Hyderabad, India Employment Type: Full Time Or Contract Timings: 2 PM to 11 PM IST Job Summary: We are looking for an experienced Tableau BI Developer to join our data and analytics team. The ideal candidate will have 5–6 years of hands-on experience in developing interactive dashboards and reports using Tableau, translating business needs into actionable insights, and ensuring data accuracy and integrity across various sources. Key Responsibilities: Design, develop, and maintain interactive Tableau dashboards and visualizations based on business requirements. Collaborate with business stakeholders, analysts, and data engineers to gather and understand reporting needs. Translate complex datasets into accessible insights to drive business decisions. Optimize and fine-tune Tableau dashboards for performance and usability. Perform data analysis and validation to ensure data accuracy and consistency. Develop and maintain documentation of dashboards, data sources, and business logic. Work with SQL and data warehouses to prepare datasets and write complex queries for reporting purposes. Ensure best practices in data visualization, UX, and storytelling. Provide user training and support on Tableau dashboards and usage. Required Skills & Qualifications: 5–6 years of professional experience in Tableau development. Strong expertise in Tableau Desktop and Tableau Server/Online. Proficient in writing complex SQL queries for data extraction and transformation. Experience working with large datasets from relational databases like SQL Server, Oracle, Snowflake, or Redshift. Familiarity with ETL concepts and data warehousing principles. Strong understanding of data modeling, normalization, and performance tuning. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder management skills. Nice-to-Have: Experience with other BI tools (e.g., Power BI, QlikView). Exposure to cloud platforms (e.g., AWS, Azure, GCP). Knowledge of Python or R for data analysis and automation. Tableau certification(s) is a plus.
Posted 1 month ago
55.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world . Your Role Lead the design, implementation, and optimization of robust PostgreSQL database solutions for global clients. Leverage deep expertise in PostgreSQL to architect scalable, secure, and high-performing database systems for business-critical applications. Design and implement end-to-end PostgreSQL database architectures tailored to specific client requirements around performance, availability, and data management. Collaborate with cross-functional teams to integrate PostgreSQL with diverse technology stacks such as Python, Java, Node.js, and cloud platforms (AWS, Azure, GCP). Conduct technical workshops, gather database-related requirements, and translate business needs into effective database solutions. Optimize database performance through advanced SQL tuning, indexing strategies, and efficient query design. Ensure data integrity, implement robust backup and recovery procedures, and configure high availability and failover solutions. Provide mentorship and technical guidance to junior DBAs and developers. Promote adherence to industry best practices, security standards, and quality guidelines in database Your Profile Proven experience in PostgreSQL database administration, design, and performance tuning. Strong understanding of relational database design, normalization, indexing, and query optimization. Hands-on experience with advanced SQL, stored procedures, triggers, and functions. Experience in integrating PostgreSQL with cloud platforms and application environments. Ability to lead client engagements, provide technical guidance, and manage project deliverables. Familiarity with PostgreSQL extensions (e.g., PostGIS, TimescaleDB) and replication strategies. Strong communication and organizational skills, with a collaborative and proactive mindset What You’ll Love About Working With Us We value flexibility and support a healthy work-life balance through remote and hybrid work options. Competitive compensation and benefits. Career development programs and certifications in SAP and cloud technologies. A diverse and inclusive workplace that fosters innovation and collaboration. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description We are looking for a skilled Power BI and SQL Developer who can design and deliver robust, user-friendly dashboards and data solutions. The ideal candidate will have strong experience in Power BI development, SQL programming, and data modeling, with excellent communication and stakeholder management abilities. Key Responsibilities Design, develop, and maintain interactive Power BI dashboards and reports based on business requirements. Build efficient data models (star/snowflake schemas) and implement Row-Level Security (RLS) for secure reporting. Develop complex DAX expressions and calculations to support advanced analytics. Write and optimize complex SQL queries, stored procedures, and views to support data extraction and transformation. Collaborate with stakeholders to gather requirements, understand business needs, and translate them into effective data solutions. Work closely with cross-functional teams, providing regular updates and managing stakeholder expectations throughout the project lifecycle. Must-Have Skills - Power BI & SQL Development Proven experience in designing, developing, and maintaining Power BI dashboards and reports. Proficiency in creating optimized data models (star/snowflake schemas) and implementing Row-Level Security (RLS) for secure and scalable reporting. Strong command of advanced DAX expressions for complex data calculations and analytical insights. Expertise in writing and optimizing complex SQL queries, stored procedures, and database views. Solid understanding of query performance tuning, data normalization, and indexing strategies, along with monitoring and supporting Power BI reports. Excellent verbal and written communication skills. Proven ability to work with business stakeholders and cross-functional teams. Good-to-Have Skills - Azure Data Factory (ADF) & ETL Pipelines: Experience with Azure Data Factory (ADF) and building scalable ETL pipelines. Familiarity with Azure Synapse Analytics, Blob Storage, or other Azure data services. Ability to schedule and automate data refreshes and transformations in a cloud environment. Qualifications Bachelor’s degree in information technology, Computer Science, or related discipline. 4 to 6 years of hands-on experience in Power BI (including DAX, Power Query, and Data Modeling) and SQL development
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities: Develop, optimize, and maintain complex SQL queries, stored procedures, functions, and views. Analyze slow-performing queries and optimize execution plans to improve database performance. Design and implement indexing strategies to enhance query efficiency. Work with developers to optimize database interactions in applications. Develop and implement Teradata best practices for large-scale data processing and ETL workflows. Monitor and troubleshoot Teradata performance issues using tools like DBQL (Database Query Log), Viewpoint, and Explain Plan Analysis. Perform data modeling, normalization, and schema design improvements. Collaborate with teams to implement best practices for database tuning and performance enhancement. Automate repetitive database tasks using scripts and scheduled jobs. Document database architecture, queries, and optimization techniques. Required Skills & Qualifications: Strong proficiency in Teradata SQL, including query optimization techniques. Strong proficiency in SQL (T-SQL, PL/SQL, or equivalent). Experience with indexing strategies, partitioning, and caching techniques. Knowledge of database normalization, denormalization, and best practices. Familiarity with ETL processes, data warehousing, and large datasets. Experience in writing and optimizing stored procedures, triggers, and functions. Hands-on experience in Teradata performance tuning, indexing, partitioning, and statistics collection. Experience with EXPLAIN plans, DBQL analysis, and Teradata Viewpoint monitoring. Good to have: Power BI/Tableau integration experience. About Us: Bristlecone is the leading provider of AI-powered application transformation services for the connected supply chain. We empower our customers with speed, visibility, automation, and resiliency – to thrive on change. Our transformative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement and Digitalization are positioned around key industry pillars and delivered through a comprehensive portfolio of services spanning digital strategy, design and build, and implementation across a range of technology platforms. Bristlecone is ranked among the top ten leaders in supply chain services by Gartner. We are headquartered in San Jose, California, with locations across North America, Europe and Asia, and over 2,500 consultants. Bristlecone is part of the $19.4 billion Mahindra Group. Equal Opportunity Employer: Bristlecone is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. Information Security Responsibilities: Understand and adhere to Information Security policies, guidelines, and procedures, and practice them for protection of organizational data and Information Systems. Take part in information security training and act accordingly while handling information. Report all suspected security and policy breaches to the InfoSec team or appropriate authority (CISO). Understand and adhere to the additional information security responsibilities as part of the assigned job role.
Posted 1 month ago
1.0 years
7 - 8 Lacs
Hyderābād
On-site
Greetings from Star Secutech Pvt Ltd!!! A huge welcome to immediate joiners!!! Job Title: SR Executive. Reporting to: Team Leader/AM/DM. Location: Hyderabad. Working Hours/Days: 9 hours / 5 days a week. Shift: U.S. Shift (5:30 PM – 2:30 AM). Salary: 7-8 LPA (Negotiable). Job Role: Data types: Identify the data types of each data set and ensure compatibility. Harmonization process: Develop a harmonization process that outlines the steps required to harmonize data, such as data cleansing, normalization, and validation. Disparate data sources: Consider data sources that may have different formats, such as databases, spreadsheets, and APIs. Develop methods to integrate and harmonize data from various sources. Harmonization tools: Utilize various tools and technologies, such as extract, transform, load (ETL) tools, data integration platforms, and data cleansing software, to streamline the harmonization process. Harmonization schema: Define a harmonization schema that standardizes the data structure, format, and terminology across different data sets. Interested candidates: call or DM 9087726632 to proceed further with the interview and start working! All the best!! Job Types: Full-time, Permanent. Pay: ₹700,000.00 - ₹800,000.00 per year. Benefits: Health insurance, leave encashment, paid sick time, paid time off, Provident Fund. Schedule: Evening shift, fixed shift, Monday to Friday, night shift, UK shift, US shift. Supplemental Pay: Performance bonus, shift allowance, yearly bonus. Education: Bachelor's (Required). Experience: Pharmacovigilance: 1 year (Required). Location: Hyderabad, Telangana (Required). Shift availability: Night Shift (Required). Work Location: In person. Application Deadline: 29/07/2025. Expected Start Date: 07/07/2025
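The harmonization process described above (cleanse, normalize, validate records from disparate sources into one schema) can be sketched briefly; the field mappings and date formats below are invented for illustration.

```python
# Minimal sketch of a harmonization step: map disparate source fields into
# one target schema, normalize formats, and validate. Mappings are invented.
from datetime import datetime

FIELD_MAP = {
    "db":    {"pat_id": "patient_id", "evt_date": "event_date"},
    "sheet": {"PatientID": "patient_id", "Date of Event": "event_date"},
}

def harmonize(record: dict, source: str) -> dict:
    # Rename source-specific fields to the harmonized schema.
    out = {FIELD_MAP[source].get(k, k): v for k, v in record.items()}
    # Normalize: one canonical date format regardless of source convention.
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            out["event_date"] = datetime.strptime(out["event_date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    # Validate: required keys must survive the mapping.
    assert {"patient_id", "event_date"} <= out.keys(), f"bad record: {record}"
    return out

print(harmonize({"pat_id": "P001", "evt_date": "2025-06-30"}, "db"))
print(harmonize({"PatientID": "P002", "Date of Event": "30/06/2025"}, "sheet"))
```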
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
8+ years of experience in data engineering or a related field. Strong expertise in Snowflake including schema design, performance tuning, and security. Proficiency in Python for data manipulation and automation. Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.). Experience with DBT for data transformation and documentation. Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect). Strong SQL skills and experience with large-scale data sets. Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
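Since orchestration frameworks like Airflow are called out, here is a minimal DAG sketch wiring an extract task ahead of a transform task; the task bodies, schedule, and IDs are placeholders (and the `schedule` argument assumes Airflow 2.4+).

```python
# Minimal sketch: an Airflow DAG ordering extract -> transform.
# Task logic, schedule, and IDs are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source")      # placeholder body

def transform():
    print("run dbt / SQL transformations")  # placeholder body

with DAG(
    dag_id="snowflake_elt_demo",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # enforce ordering
```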
Posted 1 month ago
0.0 - 2.0 years
0 Lacs
Bengaluru, Karnataka
On-site
We’re Hiring | PostgreSQL Developer (2-4 Years Experience) Location: Jayanagar, Bangalore Company: Agile Labs Work Type: Full-Time | Work from Office Experience: 2 to 4 Years About Us: At Agile Labs, we build innovative software solutions using our proprietary low-code, no-code platform. We serve a wide range of industries by helping them digitize and streamline operations quickly and efficiently. We are passionate about clean code, scalable architecture, and performance-driven development. Role Overview: We are looking for a skilled and detail-oriented PostgreSQL Developer to join our growing team. The ideal candidate will have hands-on experience in designing, optimizing, and maintaining PostgreSQL databases in a fast-paced application environment. Key Responsibilities: Design, implement, and optimize complex PostgreSQL database queries and stored procedures. Analyze existing SQL queries for performance improvements and suggest optimizations. Develop and maintain database structures to support evolving application and business requirements. Ensure data integrity and consistency across the platform. Work closely with application developers to understand data needs and deliver effective database solutions. Perform database tuning, indexing, and maintenance. Create, review, and optimize database scripts, views, and functions. Implement backup and recovery plans, data security protocols, and access control mechanisms. Required Skills & Experience: 2 to 4 years of hands-on experience in PostgreSQL development. Strong knowledge of SQL, PL/pgSQL, functions, triggers, and stored procedures. Experience in database design, normalization, and data modeling. Proficiency in writing efficient queries and performance tuning. Good understanding of indexing strategies and query execution plans. Experience in version control systems like Git. Familiarity with Linux-based environments and scripting is a plus. Strong problem-solving skills and ability to work independently or in a team. Nice to Have: Experience working in Agile development environments. Exposure to cloud platforms (AWS, Azure, or GCP) and DBaaS solutions. Understanding of NoSQL/other databases is a plus. Familiarity with data warehousing and analytics tools. Why Join Agile Labs? Opportunity to work on innovative, impactful software solutions. Collaborative and learning-driven culture. Flexible and transparent work environment. Be part of a company that is redefining how software is built. Job Types: Full-time, Permanent Pay: Up to ₹600,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Education: Bachelor's (Required) Experience: PostgreSQL: 2 years (Required) Location: Bengaluru, Karnataka (Required) Work Location: In person
Posted 1 month ago
15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Lead Engineer
Location: Pune, India
Role Description
The Lead Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
Planning and developing entire engineering solutions to accomplish business goals
Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
Ensuring maintainability and reusability of engineering solutions
Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
Reviewing engineering plans and quality to drive re-use and improve engineering capability
Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank
What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those aged 35 and above
Your Key Responsibilities:
The candidate is expected to:
Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
Champion engineering best practices and guide/mentor the team to achieve high performance
Work closely with business stakeholders, the Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes
Acquire functional knowledge of the business capability being digitized/re-engineered
Demonstrate ownership, inspire others, think innovatively, maintain a growth mindset, and collaborate for success
Your Skills & Experience:
Minimum 15 years of IT industry experience in full-stack development
Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Strong experience in big data processing – Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
Strong experience in Kubernetes and the OpenShift container platform
Experience with databases – Oracle, PostgreSQL, MongoDB, Redis/Hazelcast; should understand data modeling, normalization, and performance optimization
Experience in message queues (RabbitMQ/IBM MQ, JMS) and data streaming, e.g., Kafka, Pub/Sub
Experience working on public cloud – GCP preferred, AWS or Azure
Knowledge of various distributed/multi-tiered architecture styles – microservices, data mesh, integration patterns, etc.
Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operations
Experience leading teams and mentoring developers
Focus on quality – experience with TDD, BDD, stress and contract tests
Proficient in working with APIs (Application Programming Interfaces) and understanding data formats like JSON, XML, YAML, Parquet, etc.
Key Skills:
Java
Spring Boot
NodeJS
SQL/PLSQL
ReactJS
Advantageous:
Prior experience in the Banking/Finance domain
Experience with hybrid cloud solutions, preferably using GCP
Experience with product development
How we'll support you:
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
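Given the emphasis on data streaming and reliability above, a short hedged sketch of a reliability-minded producer; it is shown in Python via kafka-python for brevity even though the role itself is Java/Spring Boot centric, and the broker address and topic are placeholders:

```python
import json
from kafka import KafkaProducer

# Placeholder broker and topic names, for illustration only.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for the full in-sync replica set: reliability over latency
    retries=5,   # retry transient broker errors rather than dropping records
)
producer.send("payments.events", {"payment_id": "p-123", "status": "SETTLED"})
producer.flush()  # block until buffered records are actually delivered
```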
Posted 1 month ago
10.0 years
0 Lacs
India
On-site
About Fresh Gravity: Founded in 2015, Fresh Gravity helps businesses make data-driven decisions. We are driven by data and its potential as an asset to drive business growth and efficiency. Our consultants are passionate innovators who solve clients' business problems by applying best-in-class data and analytics solutions. We provide a range of consulting and systems integration services and solutions to our clients in the areas of Data Management, Analytics and Machine Learning, and Artificial Intelligence. In the last 10 years, we have put together an exceptional team and have delivered 200+ projects for over 80 clients ranging from startups to several Fortune 500 companies. We are on a mission to solve some of the most complex business problems for our clients using some of the most exciting new technologies, providing the best of learning opportunities for our team. We are focused and intentional about building a strong corporate culture in which individuals feel valued, supported, and cared for. We foster an environment where creativity thrives, paving the way for groundbreaking solutions and personal growth. Our open, collaborative, and empowering work culture is the main reason for our growth and success. To know more about our culture and employee benefits, visit our website: https://www.freshgravity.com/employee-benefits/. We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. We are data driven. We are passionate. We are innovators. We are Fresh Gravity.
Requirements
What you'll do:
Bring solid hands-on experience with Talend Open Studio for Data Integration, Talend Administration Centre, and Talend Data Quality.
ETL process design: Develop and design ETL jobs, ensuring they meet business requirements and follow best practices; knowledge of SCD and normalization jobs.
Talend configuration: Configure Talend Studio, Job Server, and other Talend components.
Data mapping: Create and refine Talend mappings for data extraction, transformation, and loading.
SQL: Strong knowledge of SQL; able to develop complex SQL queries for data extraction and loading, especially against databases like Oracle, Redshift, and Snowflake.
Custom scripting: Implement custom Talend components using scripting languages like Python or Java, and use shell scripting to automate tasks.
Reusable joblets: Design and create reusable joblets for various ETL tasks.
ESB integration and real-time data integration: Implement and manage integrations with ESB (Enterprise Service Bus) systems such as Kafka/Azure Event Hub, including REST and SOAP web services.
Desirable Skills and Experience:
Experience with ETL/ELT, data transformation, data mapping, and data profiling
Strong analytical and problem-solving skills
Ability to work independently and as part of a team
Ability to work with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements
Troubleshoot and resolve data integration issues in a timely manner
Mentor junior team members and help them improve their Talend development skills
Stay up to date with the latest Talend and data integration trends and technologies
Benefits
In addition to a competitive package, we promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. In keeping with Fresh Gravity's challenger ethos, we have developed the 5 Dimensions (5D) benefits program. This program recognizes the multiple dimensions within each of us and seeks to provide opportunities for deep development across these dimensions: Enrich Myself; Enhance My Client; Build My Company; Nurture My Family; and Better Humanity.
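As a small illustration of the custom-scripting requirement, here is the kind of standalone cleanup routine that bullet points at, whether it runs inside a Talend job or as an external script; every field name and rule is hypothetical:

```python
import re

def normalize_record(row: dict) -> dict:
    """Reusable cleanup of one record; field names are illustrative only."""
    out = dict(row)
    out["customer_name"] = re.sub(r"\s+", " ", (row.get("customer_name") or "")).strip().title()
    out["email"] = (row.get("email") or "").strip().lower()
    out["phone"] = re.sub(r"\D", "", row.get("phone") or "")  # keep digits only
    return out

print(normalize_record({
    "customer_name": "  aCME   corp ",
    "email": " Ops@Acme.COM ",
    "phone": "+91 98765-43210",
}))
```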
Posted 1 month ago
3.0 years
0 Lacs
India
On-site
Job Title: Oracle Product Data Hub (PDH) Technical Consultant – Product Master Data Specialist
Location: India
Job Type: Full-Time Consultant
Experience Level: Mid to Senior-Level
Industry: ERP / Master Data Management / Manufacturing / Retail / Supply Chain
Job Summary:
We are seeking a skilled Oracle Product Data Hub (PDH) Technical Consultant with deep expertise in Product Master Data Management to support the end-to-end lifecycle of finished goods, raw materials, and pricing data in Oracle Fusion PDH. The ideal candidate will have hands-on experience in data cleansing, enrichment, transformation, validation, and mass data loading into Oracle Cloud PDH using best practices and tools such as FBDI, REST/SOAP APIs, and Data Import Templates. This role requires strong technical knowledge of Oracle PDH, a problem-solving mindset, and experience collaborating with functional teams and business users to ensure clean, standardized, and accurate product data is maintained across systems.
Key Responsibilities:
Lead technical efforts in product data onboarding, including finished goods, raw materials, and pricing structures, into Oracle Fusion Product Data Hub.
Perform data cleansing, de-duplication, normalization, and transformation activities using industry best practices and custom rulesets.
Develop and execute data migration strategies using Oracle FBDI templates, Import Maps, REST/SOAP APIs, and spreadsheets.
Create and maintain scripts or tools for mass upload, update, and validation of product data.
Collaborate with business analysts, data stewards, and IT to define and implement product data governance, data quality rules, and workflows.
Conduct data validation and reconciliation activities post-load, ensuring accuracy, completeness, and compliance with business rules.
Troubleshoot and resolve technical issues related to PDH data imports, validations, and integrations.
Support product hierarchy setup, item class configuration, attribute groups, catalogs, and data quality scorecards.
Document technical specifications, data load procedures, and configuration guides.
Required Skills and Experience:
3+ years of hands-on technical experience with Oracle Fusion Product Data Hub (PDH).
Proven experience in mass loading and maintaining product data, including finished goods, raw materials, and pricing.
Strong experience with Oracle FBDI templates, REST/SOAP web services, and Excel-based data load tools.
Proficiency in SQL and PL/SQL for data analysis and transformation.
Solid understanding of Oracle Fusion Product Hub structures: Item Classes, Templates, Catalogs, Attributes, and Change Orders.
Knowledge of item lifecycle management, global product definitions, and cross-functional data dependencies.
Familiarity with Oracle SCM modules (Inventory, Costing, Pricing) is a plus.
Experience in large-scale data migration, cleansing, and conversion projects.
Excellent analytical, communication, and stakeholder engagement skills.
Preferred Qualifications:
Oracle Cloud Certification in Product Data Management or SCM.
Experience with data governance frameworks or MDM tools.
Exposure to tools like Oracle Integration Cloud (OIC), OACS, or Informatica MDM.
Experience in the manufacturing, apparel, or retail industries preferred.
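A hedged sketch of a single item create through the Oracle Fusion REST API, the kind of call a mass-load script would issue in a loop; the host, credentials, resource version, and attribute values are assumptions, so confirm the exact resource and payload against your pod's REST catalog:

```python
import requests

# Hypothetical Fusion pod and integration user -- replace with your own.
BASE = "https://fa-xxxx.oraclecloud.com/fscmRestApi/resources/11.13.18.05"

item = {
    "OrganizationCode": "MASTER_ORG",  # illustrative master organization
    "ItemNumber": "FG-0001",
    "ItemClass": "Finished Goods",
    "PrimaryUOMValue": "Each",
}
resp = requests.post(f"{BASE}/itemsV2", json=item, auth=("integration_user", "password"))
resp.raise_for_status()  # surface validation errors from PDH immediately
print("Created item:", resp.json().get("ItemNumber"))
```

For true mass loads, FBDI templates are usually the better fit; REST suits incremental updates and validation-heavy flows.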
Posted 1 month ago
5.0 - 31.0 years
9 - 15 Lacs
Bengaluru/Bangalore
On-site
Job Title: NoSQL Database Administrator (DBA)
Department: IT / Data Management
Job Purpose: The NoSQL Database Administrator will be responsible for designing, deploying, securing, and optimizing NoSQL databases to ensure high availability, reliability, and scalability of mission-critical applications. The role involves close collaboration with developers, architects, and security teams, especially in compliance-driven environments such as UIDAI.
Key Responsibilities:
Collaborate with developers and solution architects to design and implement efficient and scalable NoSQL database schemas.
Ensure database normalization (and denormalization where appropriate) and implement indexing strategies to optimize performance.
Evaluate and deploy replication architectures to support high availability and fault tolerance.
Monitor and analyze database performance using enterprise monitoring tools and custom monitoring scripts.
Troubleshoot performance bottlenecks and optimize queries using query analysis, index tuning, and rewriting techniques.
Fine-tune NoSQL server parameters, buffer pools, caches, and system configurations to improve throughput and minimize latency.
Implement and manage Role-Based Access Control (RBAC), authentication, authorization, and auditing to maintain data integrity, confidentiality, and compliance.
Act as a liaison with UIDAI-appointed GRCP and security audit agencies, ensuring all security audits are conducted on time, and provide the necessary documentation and artifacts to address risks and non-conformities.
Participate in disaster recovery planning, backup management, and failover testing.
Key Skills & Qualifications:
Educational Qualifications: Bachelor's or Master's Degree in Computer Science, Information Technology, or a related field.
Technical Skills:
Proficiency in NoSQL databases such as MongoDB, Cassandra, Couchbase, DynamoDB, or similar.
Strong knowledge of database schema design, data modeling, and performance optimization.
Experience in setting up replication, sharding, clustering, and backup strategies.
Familiarity with performance monitoring tools and writing custom scripts for health checks.
Hands-on experience with database security, RBAC, encryption, and auditing mechanisms.
Strong troubleshooting skills related to query optimization and server configurations.
Compliance & Security:
Experience with data privacy regulations and security standards, particularly in compliance-driven sectors like UIDAI.
Ability to coordinate with government and regulatory security audit teams.
Behavioral Skills:
Excellent communication and stakeholder management.
Strong analytical, problem-solving, and documentation skills.
Proactive and detail-oriented with a focus on system reliability and security.
Key Interfaces:
Internal: Developers, Solution Architects, DevOps, Security Teams, Project Managers.
External: UIDAI-appointed GRCP, third-party auditors, security audit agencies.
Key Challenges:
Maintaining optimal performance and uptime in a high-demand, compliance-driven environment.
Ensuring security, scalability, and availability of large-scale NoSQL deployments.
Keeping up with evolving data security standards and audit requirements.
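Since RBAC and indexing strategy are central here, a brief MongoDB-flavoured sketch via pymongo; the connection string, database, role, user, and collection names are all illustrative:

```python
from pymongo import ASCENDING, MongoClient

# Placeholder connection string and names, for illustration only.
client = MongoClient("mongodb://localhost:27017")
db = client["app"]

# RBAC: a least-privilege role that reads anywhere in "app" but writes only to "orders".
db.command(
    "createRole", "orderWriter",
    privileges=[
        {"resource": {"db": "app", "collection": ""}, "actions": ["find"]},
        {"resource": {"db": "app", "collection": "orders"}, "actions": ["insert", "update"]},
    ],
    roles=[],
)
db.command("createUser", "svc_orders", pwd="change-me", roles=["orderWriter"])

# Indexing strategy: a compound index matching the dominant query pattern.
db.orders.create_index([("customer_id", ASCENDING), ("created_at", ASCENDING)])
```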
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
What You’ll Do
Manage and maintain PostgreSQL databases in development, staging, and production environments.
Write and optimize SQL queries, stored procedures, functions, and triggers to support application logic.
Design, implement, and maintain logical and physical database schemas.
Monitor database performance and implement performance tuning strategies.
Ensure data integrity, security, and availability through regular maintenance and backups.
Collaborate with application developers to understand requirements and provide efficient database solutions.
Handle database migrations, versioning, and deployment as part of CI/CD pipelines.
Perform regular database health checks, index analysis, and query optimization.
Troubleshoot and resolve database issues, including slow queries, locking, and replication errors.
What We Seek In You
Proven experience as a PostgreSQL database administrator/developer with hands-on SQL development experience.
Strong knowledge of PL/pgSQL and writing efficient stored procedures and functions.
Experience with database schema design, normalization, and data modeling.
Solid understanding of PostgreSQL internals, indexing strategies, and performance tuning.
Experience with backup and recovery tools (pg_dump, pg_restore), replication, and monitoring tools.
Proficient in Linux/Unix command-line tools for database management.
Familiar with version control systems (e.g., Git) and CI/CD practices.
Life At Next
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.
Perks Of Working With Us
Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
Abundant opportunities for engagement with customers, product managers, and leadership.
You'll be guided by progressive paths while receiving insightful guidance from managers through ongoing feedforward sessions.
Cultivate and leverage robust connections within diverse communities of interest.
Choose your mentor to navigate your current endeavors and steer your future trajectory.
Embrace continuous learning and upskilling opportunities through Nexversity.
Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
Embrace a hybrid work model promoting work-life balance.
Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
Embark on accelerated career paths to actualize your professional aspirations.
Who we are
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet the unique needs of our customers. Join our passionate team and tailor your growth with us!
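Two of the routine tasks above, health checks and backups, fit in a few lines; the DSN, threshold, and paths below are illustrative assumptions:

```python
import subprocess

import psycopg2

# Placeholder DSN -- use your own connection settings.
conn = psycopg2.connect("dbname=app user=dba")
cur = conn.cursor()

# Health check: surface queries that have been active for more than five minutes.
cur.execute("""
    SELECT pid, now() - query_start AS runtime, left(query, 80)
    FROM pg_stat_activity
    WHERE state = 'active' AND now() - query_start > interval '5 minutes'
""")
for pid, runtime, query in cur.fetchall():
    print(f"slow pid={pid} runtime={runtime} query={query!r}")

# Logical backup in custom format, restorable selectively with pg_restore.
subprocess.run(["pg_dump", "--format=custom", "--file=/backups/app.dump", "app"], check=True)
```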
Posted 1 month ago
6.0 years
25 - 30 Lacs
India
Remote
Job Title: Enterprise Data Modeler – Snowflake Specialist
Location: [Remote]
Experience Required: 6+ Years
Employment Type: [Contract]
Start Date: Immediate
Role Overview
We are seeking a highly skilled Enterprise Data Modeler with deep expertise in Snowflake and modern data modeling techniques. This role requires end-to-end experience in designing scalable, robust, and production-ready data models that align with complex business requirements and support analytics, reporting, and KPI logic. The ideal candidate has practical, hands-on experience, not just theoretical knowledge, of enterprise-level data modeling in real client projects.
Key Responsibilities
Lead the design and development of conceptual, logical, and physical data models using Snowflake.
Collaborate with business and technical stakeholders to understand KPIs, metrics, and data flows to drive appropriate data architecture.
Create technically sound, scalable data models using best-practice design patterns (e.g., Dimensional, Data Vault, Normalized).
Translate complex and ambiguous business problems into structured Snowflake-ready models, including schema, table, and column-level specs.
Develop schema objects such as views, constraints, partitions, and clustering, and leverage Snowflake features like Time Travel and Zero-Copy Cloning.
Support downstream Power BI and data mart readiness, ensuring models are optimized for semantic and reporting layers.
Design for historical and incremental loading (e.g., SCDs, CDC, audit columns, soft deletes); see the SCD Type 2 sketch after this posting.
Produce clear and concise documentation including data flow diagrams, ER diagrams, lineage maps, and model architecture visuals.
Collaborate closely with data architects and engineers to ensure model fitment within the larger data warehouse architecture.
Ensure models support governance and metadata frameworks and comply with enterprise data standards.
Required Skills And Qualifications
6+ years of enterprise-level experience in data modeling across client-facing or production projects.
Deep hands-on expertise with Snowflake SQL and schema design, including performance optimization.
Strong understanding of data warehousing concepts, including: dimensions, facts, surrogate keys; star vs. snowflake schema; normalization, fact grains, SCD types; ELT vs. ETL, semantic layers, Data Vault.
Proficiency in using tools like Lucidchart, SQLDBM, dbt docs, or similar to create ERDs and architecture visuals.
Ability to confidently present and defend data model decisions in technical reviews and stakeholder walkthroughs.
Strong verbal and written communication skills in English.
Ability to work independently, lead discussions with minimal handholding, and resolve ambiguity in business requirements.
Nice to Have
Experience with metadata-driven modeling and data governance initiatives.
Exposure to modeling strategies that support Power BI, KPI tracking, and cross-platform analytics.
Knowledge of data lineage mapping, version control of models, and model lifecycle management.
Skills: data warehousing, Snowflake, Snowflake SQL, dimensional modeling, data modeling, Data Vault, normalization, ELT, ETL, performance optimization, metadata frameworks, data governance, Lucidchart, SQLDBM, dbt docs, architecture
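The SCD Type 2 design mentioned above has a standard two-step implementation in Snowflake: expire current rows whose attributes changed, then insert fresh versions for changed and brand-new keys. A hedged sketch; the connection settings and all table/column names are illustrative:

```python
import snowflake.connector

# Step 1: close out current rows whose tracked attributes changed in staging.
# (Simplified change test; production jobs typically hash attributes and handle NULLs.)
EXPIRE_CHANGED = """
    UPDATE dim_customer d
    SET end_date = CURRENT_DATE, is_current = FALSE
    FROM stg_customer s
    WHERE d.customer_id = s.customer_id
      AND d.is_current
      AND (d.name <> s.name OR d.segment <> s.segment)
"""

# Step 2: insert a current version for new keys and for keys just expired
# (after step 1 neither group has a current row, so the anti-join catches both).
INSERT_VERSIONS = """
    INSERT INTO dim_customer (customer_id, name, segment, start_date, end_date, is_current)
    SELECT s.customer_id, s.name, s.segment, CURRENT_DATE, NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL
"""

# Placeholder credentials -- supply your own account details.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="etl_wh", database="analytics", schema="core",
)
cur = conn.cursor()
for stmt in (EXPIRE_CHANGED, INSERT_VERSIONS):
    cur.execute(stmt)
conn.commit()
```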
Posted 1 month ago