
901 Schema Jobs - Page 12

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 10.0 years

10 - 15 Lacs

Panchkula

Work from Office

Job Description

We are looking for a skilled and experienced ETL Engineer to join our growing team at Grazitti Interactive. In this role, you will be responsible for building and managing scalable data pipelines across traditional and cloud-based platforms. You will work with structured and unstructured data sources, leveraging tools such as SQL Server, Snowflake, Redshift, and BigQuery to deliver high-quality data solutions. If you have hands-on experience in Python, PySpark, and cloud platforms like AWS or GCP, along with a passion for transforming data into insights, we'd love to connect with you.

Key Skills
- Strong experience (4-10 years) in ETL development using platforms like SQL Server and Oracle, and cloud environments like Amazon S3, Snowflake, Redshift, Data Lake, and Google BigQuery.
- Proficient in Python, with hands-on experience creating data pipelines using APIs.
- Solid working knowledge of PySpark for large-scale data processing.
- Ability to output results in various formats, including JSON, data feeds, and reports.
- Skilled in data manipulation, schema design, and transforming data across diverse sources.
- Strong understanding of core AWS/Google Cloud services and basic cloud architecture.
- Capable of developing, deploying, and debugging cloud-based data assets.
- Expert-level proficiency in SQL with a solid grasp of relational and cloud-based databases.
- Excellent ability to understand and adapt to evolving business requirements.
- Strong communication and collaboration skills, with experience in onsite/offshore delivery models.
- Familiarity with Marketo, Salesforce, Google Analytics, and Adobe Analytics.
- Working knowledge of Tableau and Power BI for data visualization and reporting.

Roles and Responsibilities
- Design and implement robust ETL processes to ensure data integrity and accuracy across systems.
- Develop reusable data solutions and optimize performance across traditional and cloud environments.
- Collaborate with cross-functional teams, including data analysts, marketers, and engineers, to define data requirements and deliver insights.
- Take ownership of end-to-end data pipelines, from requirement gathering to deployment and monitoring.
- Ensure compliance with internal QMS and ISMS standards.
- Proactively report any data incidents or concerns to reporting managers.

Life at Grazitti: Share your profile — we are always looking for the best talent to join our team.
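The posting centers on Python/PySpark pipelines that move data between sources such as SQL Server and cloud warehouses. As a rough illustration only — the table names, paths, and JDBC settings below are hypothetical, not from the posting — a minimal PySpark job of this shape might look like:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: extract from a JDBC source, transform, load to Parquet.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read a table over JDBC (connection details are placeholders).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Transform: basic cleansing and a derived column.
cleaned = (
    orders.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("net_amount", F.col("amount") - F.col("discount"))
)

# Load: write partitioned Parquet to a data-lake path (e.g., S3).
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)
```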

Posted 4 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Chennai

Work from Office

University degree in Computer Science, IT, or a related field. 3+ years of hands-on ABAP development experience. Problem-solving mindset and excellent communication skills in English. ABAP knowledge is great, but experience with JavaScript, web technologies, or cloud basics is a big plus. Preferred: SAP certifications, S/4HANA exposure, or basic German proficiency.

Benefits
- Competitive salary & flexible work arrangements.
- Career growth in a global SAP leader.
- Collaborative culture bridging German precision and Indian innovation.
- Play a key role in building our India presence from the ground up!

Posted 4 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

Sr. Database Administrator

Job Description:

Key Responsibilities:
- Install, configure, and maintain MongoDB and other databases.
- Ensure database performance, security, and scalability.
- Implement replication, sharding, and backup strategies.
- Optimize queries, indexing, and storage for efficiency.
- Monitor database health using tools like Ops Manager, Prometheus, or Grafana.
- Troubleshoot database issues and ensure high availability.
- Automate database management tasks using Shell/Python scripting.
- Collaborate with development teams to optimize schema design and queries.

Requirements:
- 4-6 years of experience in database administration.
- Strong expertise in MongoDB (preferred), MySQL, or PostgreSQL.
- Hands-on experience with replication, sharding, and high availability.
- Knowledge of backup, restore, and disaster recovery strategies.
- Experience in Linux environments and scripting (Shell, Python).
- Familiarity with MongoDB Atlas, AWS RDS, or cloud-based databases.

Preferred Qualifications:
- MongoDB Certification is a plus.
- Experience with DevOps tools like Docker, Kubernetes, or Ansible.
- Exposure to both SQL and NoSQL databases.

Experience Range: 4-6 years
Educational Qualifications: Any graduation
Skills Required: Big Data
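Several of the listed tasks (index optimization, health monitoring, scripted automation) are commonly done from Python. As a hedged sketch only — the connection URI, database, and collection names are hypothetical — a small PyMongo maintenance script might look like:

```python
from pymongo import MongoClient, ASCENDING

# Hypothetical connection string; in practice this would point at a replica set.
client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
db = client["appdb"]

# Create an index to speed up a common query pattern.
db.orders.create_index([("customer_id", ASCENDING), ("created_at", ASCENDING)])

# Basic health checks an admin script might automate.
server_status = db.command("serverStatus")               # uptime, connections, memory
repl_status = client.admin.command("replSetGetStatus")   # replica-set health

print("uptime (s):", server_status["uptime"])
print("connections:", server_status["connections"]["current"])
for member in repl_status["members"]:
    print(member["name"], member["stateStr"])  # e.g. PRIMARY / SECONDARY
```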

Posted 4 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Kolkata

Work from Office

Take charge of our organic growth strategies. Optimize websites, improve search rankings, and drive traffic using the latest SEO tactics, including on-page, off-page, and technical SEO.

About the Role
ElevateDigi is looking for a results-driven SEO Specialist. This role requires a blend of analytical thinking, technical SEO knowledge, and content optimization expertise.

Key Responsibilities
- Conduct keyword research and optimize on-page elements (titles, metas, headers, internal links).
- Improve website speed, mobile-friendliness, and Core Web Vitals.
- Perform content optimization and implement structured data (Schema Markup).
- Develop and execute link-building strategies.
- Monitor the backlink profile and perform technical SEO audits (crawl errors, broken links, sitemaps, robots.txt).
- Use tools like Google Analytics, Search Console, SEMrush, and Ahrefs for tracking and reporting.
- Stay updated on search engine algorithm changes.

Requirements
- 2+ years of proven SEO experience.
- Strong knowledge of SEO tools (Google Analytics, Search Console, Ahrefs, SEMrush, Moz, Screaming Frog).
- Understanding of HTML, CSS, JavaScript, and website architecture.
- Experience with WordPress, Shopify, and other CMS platforms.
- Ability to conduct competitor analysis.
- Strong analytical and problem-solving skills.

Bonus Consideration: Web Development Experience
Candidates with basic to advanced web development skills (HTML, CSS, JavaScript, WordPress, Shopify) will be given preference. Work with a dynamic team that values innovation.
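Since the role calls for implementing structured data (Schema Markup), here is a minimal, hedged illustration of generating a schema.org JobPosting JSON-LD block; all field values are hypothetical placeholders:

```python
import json

# Hypothetical values; a real implementation would populate these from a CMS.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "SEO Specialist",
    "datePosted": "2024-01-01",
    "hiringOrganization": {"@type": "Organization", "name": "ElevateDigi"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "Kolkata"},
    },
}

# Emit the <script> tag that would be embedded in the page <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(job_posting, indent=2)
    + "\n</script>"
)
print(snippet)
```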

Posted 4 weeks ago

Apply

15.0 - 20.0 years

45 - 55 Lacs

Pune, Bengaluru

Work from Office

As a Principal Architect - MERN Stack, you will collaborate closely with key stakeholders to design and develop a core product for one of our valued clients. Your role will require deep technical expertise and a strategic mindset to ensure scalability, performance, and exceptional user experience.

Responsibilities:
- Lead the architectural design and development of a scalable, high-performance product.
- Spearhead the modernization of the Radiology Worklist Program's user interface by developing a Micro Frontend (MFE) application using React.
- Seamlessly integrate the new React-based MFE with existing Angular components, leveraging Material-UI (MUI) for a consistent design system.
- Enhance and refactor backend services built in Node.js to support new features, including customizable user-defined worklists.
- Conduct a comparative analysis of AlloyDB and PostgreSQL to determine the optimal database solution for advanced analytics.
- Redesign and optimize the database schema to support new functionalities and improve overall system efficiency.
- Design and implement an intermediate caching layer using Elasticsearch to boost application performance and scalability.

Experience:
- 15-20 years

Educational Qualifications:
- Engineering degree - BE/ME/BTech/MTech/BSc/MSc
- Technical certification in multiple technologies is desirable.

Skills:
Mandatory Technical Skills:
- Node.js, ReactJS, PostgreSQL
- Strong focus on user experience

Good to Have Skills:
- AlloyDB

Posted 4 weeks ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Mumbai

Work from Office

- Design and develop event streaming solutions using Apache Kafka and Confluent Kafka.
- Build producers, consumers, and Kafka Streams applications using Java, Python, or Scala.
- Create and manage Kafka topics, schemas (Avro/Protobuf/JSON), and partitions for performance and scalability.
- Implement Kafka Connect to integrate with data sources (e.g., databases, S3, etc.).
- Use Confluent Schema Registry to enforce data serialization standards.
- Manage Kafka infrastructure (Confluent Cloud or on-prem), ensuring high availability and failover.
- Monitor and optimize Kafka performance; troubleshoot lag, throughput issues, and errors.
- Work with DevOps teams to automate deployments using CI/CD tools.
- Document solutions, best practices, and integration patterns.
- Collaborate with backend engineers, data engineers, and architects to deliver reliable streaming systems.
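As a rough sketch of the producer side described above — the topic name and broker address are hypothetical, and this assumes the confluent-kafka Python client — a minimal producer could look like:

```python
import json

from confluent_kafka import Producer

# Hypothetical broker address.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Delivery callback: report success or failure per message.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

event = {"order_id": 42, "status": "CREATED"}
producer.produce(
    topic="orders",                  # hypothetical topic name
    key=str(event["order_id"]),
    value=json.dumps(event),
    on_delivery=on_delivery,
)

# Block until all queued messages are delivered.
producer.flush()
```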

Posted 4 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Mumbai

Work from Office

We are seeking a skilled and motivated Kafka Engineer to design, implement, and maintain robust event-driven data pipelines and real-time streaming applications. The ideal candidate will have a deep understanding of Apache Kafka, data streaming architectures, and distributed systems, and will play a critical role in ensuring the high availability, performance, and scalability of our data infrastructure.

Key Responsibilities:
- Design, develop, and manage Kafka-based data streaming solutions.
- Set up, configure, and maintain Kafka clusters, brokers, topics, and schemas.
- Monitor, optimize, and ensure the availability and performance of Kafka services.
- Implement Kafka Connect, Kafka Streams, and Schema Registry as needed.
- Develop producers and consumers for real-time data ingestion and processing.
- Collaborate with DevOps, Data Engineering, and Backend teams to integrate Kafka with other systems (e.g., Spark, Flink, MongoDB, PostgreSQL).
- Automate deployment, scaling, and recovery of Kafka components.
- Implement monitoring and alerting tools (e.g., Prometheus, Grafana, Splunk).
- Ensure data security, compliance, and governance across Kafka pipelines.
- Participate in on-call rotations and provide support for production systems.

Required Skills & Qualifications:
- 2+ years of hands-on experience with Apache Kafka in production environments.
- Strong programming skills in Java, Scala, or Python.
- Experience with Kafka Connect, Kafka Streams, and Kafka REST Proxy.
- Solid understanding of distributed systems, messaging systems, and real-time data processing.
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
- Experience with monitoring tools like Confluent Control Center, Prometheus, Grafana, or Datadog.
- Proficiency with Linux/Unix systems and scripting.
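Complementing the producer sketch above, a minimal consumer loop — again a hedged sketch with hypothetical broker, group, and topic names, using the confluent-kafka Python client — might look like:

```python
from confluent_kafka import Consumer

# Hypothetical configuration values.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-processor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # wait up to 1 s for a message
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Process the message; real code would deserialize and act on it.
        print(msg.key(), msg.value().decode("utf-8"))
except KeyboardInterrupt:
    pass
finally:
    consumer.close()  # commit final offsets and leave the group
```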

Posted 4 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Graduate with a minimum of 6+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed with data warehouse schemas and OLAP techniques.

Preferred technical and professional experience
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 4 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Description

Job Profile - Lead Data Engineer

Does working with data on a day-to-day basis excite you? Are you interested in building robust data architecture to identify data patterns and optimise data consumption for our customers, who will forecast and predict what actions to undertake based on data? If this is what excites you, then you'll love working in our intelligent automation team.

Schneider AI Hub is leading the AI transformation of Schneider Electric by building AI-powered solutions. We are looking for a savvy Data Engineer to join our growing team of AI and machine learning experts. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software engineers, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional requirements.
- Design the right schema to support the functional requirements and consumption patterns.
- Design and build production data pipelines from ingestion to consumption.
- Create the necessary preprocessing and postprocessing for various forms of data for training/retraining and inference ingestion as required.
- Create data visualization and business intelligence tools for stakeholders and data scientists for necessary business/solution insights.
- Identify, design, and implement internal process improvements: automating manual data processes, optimizing data delivery, etc.
- Ensure our data is separated and secure across national boundaries through multiple data centers.

Requirements and Skills
- A bachelor's or master's degree in Computer Science, Information Technology, or other quantitative fields.
- At least 8 years working as a data engineer supporting large data transformation initiatives related to machine learning, with experience in building and optimizing pipelines and data sets.
- Strong analytic skills related to working with unstructured datasets.
- Experience with Azure cloud services: ADF, ADLS, HDInsight, Databricks, App Insights, etc.
- Experience in handling ETLs using Spark.
- Experience with object-oriented/object function scripting languages: Python, PySpark, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- A good team player, committed to the success of the team and the overall project.

About Us
Schneider Electric creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Great people make Schneider Electric a great company.

Schedule: Full-time
Req: 0098TK
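The posting lists workflow managers such as Airflow. Purely as an illustration — the DAG id, task names, and task bodies are hypothetical, and this assumes Airflow 2.4+ — a minimal pipeline wiring an ingest step to a transform step could look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies; real tasks would call ADF, Spark, etc.
def ingest():
    print("pull raw data from source")

def transform():
    print("clean and reshape the data")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # 'schedule' is the Airflow 2.4+ spelling
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run ingest before transform.
    ingest_task >> transform_task
```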

Posted 4 weeks ago

Apply

6.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

We are seeking a skilled and motivated Lead Software Developer (Python) with 6-8 years of hands-on experience in designing, developing, and deploying Python-based microservices. The ideal candidate should have expertise working with cloud-native architectures using Docker and Kubernetes, and integrating services with Large Language Models (LLMs) via OpenAI APIs. You will lead the design, development, and deployment of scalable services while working in an Agile/Scrum environment.

Essential Duties and Responsibilities:
- Lead the design and implementation of Python microservices hosted on Kubernetes or Docker environments.
- Develop and maintain Python microservices where communication between microservices uses RabbitMQ.
- Design and optimize database schemas; implement data access layers using PostgreSQL and MongoDB.
- Integrate LLM capabilities via OpenAI or similar APIs into microservices.
- Write unit, integration, and system tests; ensure code quality and maintainability.
- Track work progress and maintain up-to-date tasks on Azure Boards (or similar work item tracking systems).
- Manage source code repositories, branching strategies, pull requests, and reviews using Git-based tools (Azure Repos, GitHub, or similar).
- Build, configure, and maintain CI/CD pipelines using Azure Pipelines for automated testing and deployments.
- Participate in Agile ceremonies (sprint planning, stand-ups, retrospectives) and collaborate effectively with cross-functional teams.
- Mentor and guide junior developers on coding standards, best practices, and architecture decisions.

Required experience:
- Programming skills: strong expertise in Python (3.x), with knowledge of best practices for building scalable services; application of proven programming principles and patterns; prompt engineering skill to generate code using Cursor, Copilot, or similar tools.
- Design skills: strong expertise in OOP with Python; strong code design/modelling skills using UML or similar tools.
- Frameworks/libraries: experience with popular Python frameworks such as SQLAlchemy and Alembic; experience building custom Python packages.
- Containerization & orchestration: hands-on experience with Docker and Kubernetes (AKS or self-hosted).
- Messaging systems: proven experience using RabbitMQ for asynchronous service communication.
- Databases: relational - PostgreSQL (schema design, performance tuning); NoSQL - MongoDB (data modelling, CRUD operations).
- APIs & LLMs: integration of microservices with LLMs or OpenAI APIs; handling authentication, request/response flows, and prompt engineering basics.
- High-level understanding of the following tech stack: source control with Git (Azure Repos, GitHub); building and deploying using Azure Pipelines or similar CI/CD tools; familiarity with container registries (ACR, Docker Hub); work item tracking and task management using Azure Boards (or Jira).
- Experience working in Agile/Scrum methodologies, including sprint ceremonies, story point estimation, and continuous delivery.

Additional skills:
- Excellent problem-solving and troubleshooting/debugging skills.
- Strong understanding of RESTful API design principles.
- Familiarity with observability tools (logging, metrics, tracing) is a plus.
- Excellent verbal and written communication skills.
- Ability to lead technical discussions and present solutions effectively.
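For the RabbitMQ-based inter-service communication the role describes, a minimal publish/consume sketch with the pika client — the queue name and connection details are hypothetical — might look like:

```python
import json

import pika

# Hypothetical local broker; production would use credentials and TLS.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# A durable queue survives broker restarts.
channel.queue_declare(queue="tasks", durable=True)

# Publish a message from one microservice...
channel.basic_publish(
    exchange="",
    routing_key="tasks",
    body=json.dumps({"job_id": 1, "action": "resize_image"}),
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)

# ...and consume it in another.
def handle(ch, method, properties, body):
    print("received:", json.loads(body))
    ch.basic_ack(delivery_tag=method.delivery_tag)  # acknowledge after processing

channel.basic_consume(queue="tasks", on_message_callback=handle)
channel.start_consuming()
```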

Posted 4 weeks ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Industry: Software Quality Assurance & Automation Testing

A fast-growing provider of full-stack digital engineering services, we help global enterprises accelerate software delivery by embedding intelligent automation throughout the SDLC. Our Quality Engineering practice designs robust test automation frameworks that ensure flawless user experiences across web, mobile, and API layers.

Standardized Title: Cypress Automation Engineer
Location: On-site, Hyderabad or Bangalore

Role & Responsibilities
- Design, develop, and maintain scalable Cypress test suites covering UI, functional, and regression scenarios for React/Angular web apps.
- Integrate automated tests into CI/CD pipelines (Jenkins/GitHub Actions) to enable shift-left quality and rapid feedback loops.
- Author robust API tests leveraging Cypress and REST/GraphQL modules to validate backend services and contract compliance.
- Collaborate with developers and product owners in Agile ceremonies, converting acceptance criteria into executable test cases and BDD specs.
- Monitor test execution, triage failures, debug JavaScript code, and drive root-cause analysis to closure.
- Champion automation best practices, code reviews, and metrics dashboards while mentoring junior engineers on Cypress and modern QA tooling.

Skills & Qualifications

Must-Have
- 7+ years of automation testing with Cypress and JavaScript/TypeScript.
- Hands-on with DOM selectors, async assertions, fixtures, and custom commands.
- Proficient in RESTful API testing, JSON schema validation, and Postman/Swagger.
- CI/CD exposure using Jenkins, GitLab, or GitHub Actions with Docker containers.
- Solid understanding of Agile/Scrum, story pointing, and the defect life cycle.
- Git workflow mastery and experience reviewing merge requests.

Preferred
- Experience with visual regression tools (Percy, Applitools) or BDD (Cucumber).
- Knowledge of cloud browsers and cross-device labs such as BrowserStack or Sauce Labs.
- Performance test basics with Lighthouse or Web Vitals.
- Familiarity with micro-frontends and contract testing (PACT).

Benefits & Culture Highlights
- Work with cutting-edge QA automation stacks and mentorship from industry veterans.
- Clear career progression roadmap into SDET and DevOps roles.
- Collaborative, innovation-first culture that rewards continuous learning.

Ready to engineer zero-defect releases? Apply now to join our on-site QA center of excellence in India.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

14 - 16 Lacs

Bengaluru

Work from Office

Job Description:
- 8-12 years of experience in .NET technologies.
- Hands-on service design, schema design, and application integration design.
- Hands-on software development using C# and .NET Core.
- Use of multiple cloud-native database platforms including DynamoDB, SQL, ElastiCache, and others.
- Hands-on application design for high availability and resiliency.
- Hands-on problem resolution across a multi-vendor ecosystem.
- Conduct code reviews and peer reviews.
- Unit testing and unit test automation, defect resolution, and software optimization.
- Actively engage with client IT and client business during daily work sessions.
- Code deployment using CI/CD processes.
- Contribute to each step of the development process from ideation to implementation to release, including rapid prototyping, running A/B tests, continuous integration, automated testing, and continuous delivery.
- Understand business requirements and technical limitations.
- Ability to learn new technologies and influence the team and leadership to constantly implement modern solutions.
- Experience using the Elasticsearch, Logstash, Kibana (ELK) stack for logging and analytics.
- Experience in container orchestration using Kubernetes.
- Knowledge and experience working with public cloud AWS services.
- Knowledge of cloud architecture and design patterns.
- Ability to prepare documentation for microservices.
- Monitoring tools such as Datadog and Logstash.
- Excellent communication skills.
- Airline industry knowledge is preferred but not required.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites or unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 4 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Noida

Work from Office

- Experience in ServiceNow development and the ability to drive solutions independently.
- Has worked on ITSM extensively, with an understanding of CMDB; awareness of other ServiceNow products and practical work experience with them is an added advantage.
- Rich experience with ServiceNow client- and server-side JavaScript and the ServiceNow APIs.
- Experience extending the ServiceNow schema to custom applications, working with ServiceNow platform capabilities, and implementing scoped applications.
- Experience managing flows and workflows of medium to complex nature.
- Understands scripted web services and related technologies such as AJAX, Business Rules, JavaScript, SOAP, and REST.
- SSO-SAML setup and integration of ServiceNow with other applications.
- Understanding of Service Portal design would be an added advantage.
- Must have general development experience.
- System integration experience using web services and other web-based technologies such as XML, HTML, AJAX, CSS, HTTP, and REST/SOAP.
- Ability to take the role of Solution Architect and deliver implementation and enhancement projects for customers along with Project Managers.
- Proficient in JavaScript with an understanding of ServiceNow scripting.
- Must have some experience working with relational databases.

Required Certifications and Knowledge:
- ServiceNow Certified System Administrator.
- ServiceNow ITSM Implementation Specialist preferred, or any other Implementation Specialist certification.
- Working in an Agile team and Scrum framework.

Total Experience Expected: 8-10 years

Posted 4 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Bengaluru

Work from Office

We are seeking skilled SQL Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation.
- Debug and resolve technical issues.
- Evaluate and review code to ensure quality and compliance.

Required Qualifications:
- 1+ year of SQL development experience.
- Expertise in writing complex SQL queries and optimizing database performance.
- Designing database schemas.
- Proficiency in working with relational database management systems like MySQL, PostgreSQL, or SQL Server.
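As a small, hedged illustration of the kind of schema design and query work described — the schema and data are hypothetical — the following uses Python's built-in sqlite3 module to create a table, add an index, and run an aggregate query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = conn.cursor()

# Hypothetical schema for illustration.
cur.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount REAL NOT NULL,
        created_at TEXT NOT NULL
    )
""")
# An index on customer_id speeds up the GROUP BY below on large tables.
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

cur.executemany(
    "INSERT INTO orders (customer_id, amount, created_at) VALUES (?, ?, ?)",
    [(1, 120.0, "2024-01-05"), (1, 80.0, "2024-02-10"), (2, 40.0, "2024-02-11")],
)

# Aggregate revenue per customer.
for row in cur.execute(
    "SELECT customer_id, SUM(amount) AS revenue "
    "FROM orders GROUP BY customer_id ORDER BY revenue DESC"
):
    print(row)

conn.close()
```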

Posted 4 weeks ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Mumbai

Work from Office

We are seeking skilled SQL Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation.
- Debug and resolve technical issues.
- Evaluate and review code to ensure quality and compliance.

Required Qualifications:
- 1+ year of SQL development experience.
- Expertise in writing complex SQL queries and optimizing database performance.
- Designing database schemas.
- Proficiency in working with relational database management systems like MySQL, PostgreSQL, or SQL Server.

Posted 4 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office

We are seeking skilled SQL Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation.
- Debug and resolve technical issues.
- Evaluate and review code to ensure quality and compliance.

Required Qualifications:
- 1+ year of SQL development experience.
- Expertise in writing complex SQL queries and optimizing database performance.
- Designing database schemas.
- Proficiency in working with relational database management systems like MySQL, PostgreSQL, or SQL Server.

Posted 4 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Kolkata

Work from Office

We are seeking skilled SQL Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation.
- Debug and resolve technical issues.
- Evaluate and review code to ensure quality and compliance.

Required Qualifications:
- 1+ year of SQL development experience.
- Expertise in writing complex SQL queries and optimizing database performance.
- Designing database schemas.
- Proficiency in working with relational database management systems like MySQL, PostgreSQL, or SQL Server.

Posted 4 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Ahmedabad

Work from Office

We are looking for a Data Engineer / Power BI Developer with 3 to 5 years of experience to join our team. The ideal candidate will be responsible for designing, building, and optimizing data pipelines, ensuring efficient data flow across various systems.

Key Responsibilities:
- Design and develop ETL operations using Azure Data Factory or similar technology.
- Work with different REST APIs to gather data.
- Design, develop, and publish interactive dashboards and reports using Power BI.
- Write efficient and optimized SQL queries to extract and manipulate data from relational databases.
- Build and maintain data models, DAX calculations, and Power BI datasets.
- Perform data analysis and validation, and ensure the quality and accuracy of reports.
- Connect Power BI reports to multiple data sources such as SQL Server, Azure SQL, Excel, APIs, Snowflake, and Databricks.
- Optimize Power BI dashboards, SQL queries, and dataflows for performance and scalability.
- Collaborate with business stakeholders to gather reporting requirements and translate them into technical solutions.
- Troubleshoot data-related and report performance issues; ensure timely resolution.
- Document report specifications, data models, business logic, and technical processes.
- Stay updated with new features and best practices in the Power BI, Azure, Snowflake, and AWS ecosystems.

Requirements:
- Experience in implementing ETL operations.
- Experience working with APIs to collect data.
- Knowledge of data visualization best practices and UI/UX design principles.
- Exposure to data warehouse concepts (e.g., Star Schema, Snowflake Schema).
- Experience implementing Row-Level Security (RLS) in Power BI reports.
- Minimum 3 years' experience with Power BI.
- Minimum 3 years' experience in data warehousing and ETL setup.
- Experience working with SQL Server, Azure SQL, and the Microsoft technology stack.
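For the "gather data from REST APIs" part of the role, here is a hedged minimal sketch — the endpoint URL, pagination scheme, and field names are hypothetical — of pulling paginated JSON into a pandas DataFrame for downstream loading:

```python
import pandas as pd
import requests

# Hypothetical paginated endpoint.
BASE_URL = "https://api.example.com/v1/orders"

def fetch_all(page_size: int = 100) -> pd.DataFrame:
    rows, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()   # fail loudly on HTTP errors
        batch = resp.json()
        if not batch:             # an empty page means we are done
            break
        rows.extend(batch)
        page += 1
    return pd.DataFrame(rows)

df = fetch_all()
# From here the frame could be written to SQL Server / Azure SQL for Power BI.
print(df.head())
```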

Posted 4 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office

About Conneqtion
Conneqtion Group is a trusted Oracle Cloud Infrastructure (OCI) & Oracle SaaS implementation partner, dedicated to helping businesses implement cutting-edge digital solutions in ERP, AI & Analytics. With a strong presence in EMEA, APAC, and NA, our consulting and services firm specializes in Oracle technologies. Our experienced team excels in Oracle Cloud implementation, utilizing the OUM methodology to deliver innovative transformations for our clients. Conneqtion Group has successfully provided Oracle professional services to over 50 SMEs and large-scale enterprises, driving efficiency and growth through tailored digital solutions. At Conneqtion Group, we harness innovation to bring about meaningful change for our clients, ensuring their systems are optimized for peak performance and future growth.

Position Overview:
We are seeking a skilled Oracle APEX Consultant to join our dynamic development team. The ideal candidate will have extensive experience in designing and developing applications built using Oracle APEX. You will work closely with other developers, project managers, and business analysts to create high-quality web applications.

Responsibilities:
- Minimum 2-4 years of relevant experience in designing and developing commercial-grade web applications using Oracle APEX (versions 21-24), Oracle SQL, and Oracle PL/SQL, including packages, procedures, functions, and triggers.
- Develop forms, reports, charts, interactive grids, REST services, and APEX workflows with ease.
- Deliver modern, elegant, secure web applications using a combination of APEX, JavaScript, PL/SQL, and REST services in a high-performing environment.
- Develop scripts in SQL and PL/SQL for data conversion, data migration, and performance tuning.
- Experience and knowledge in HTML5/CSS.
- Design APEX authorization & authentication schemes.
- Use REST services for integrations.
- Experience in managing web page layouts with APEX templates and themes.
- Perform requirements analysis, development, design, testing, and deployment of custom Oracle applications across environments.
- Experience in modernizing Oracle Forms to APEX is a plus.
- Design and maintain custom Oracle applications using Oracle APEX.
- Document supporting policies, processes, and procedures within your areas of responsibility.
- Effective creation of dynamic reports within the Oracle APEX environment, using APEX reporting tools, BI Publisher, and Jasper.
- Adherence to policies, processes, and procedures within your areas of responsibility.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum 2+ years of experience in Oracle APEX web application development with version 18.1 or above.
- Strong proficiency in SQL and Oracle PL/SQL programming, performance tuning, and database schema design principles.
- Good understanding of JavaScript, HTML5, and CSS.
- Knowledge of APEX workflows, Forms-to-APEX migration, and integration with EBS or Fusion is a plus.
- Knowledge of other scripting languages is an added advantage.
- Use best design practices and reusable components within applications.
- Experience collaborating effectively with team members and partners in a distributed project environment.
- Strong communication skills.
- Oracle APEX Cloud Developer Professional certification is a plus (not mandatory).

Conneqtion's Diversity & Inclusion Statement
At Conneqtion, diversity and inclusion are at the heart of our culture. As an equal opportunity employer, we take pride in fostering a workplace where everyone is valued and respected. Our DEI initiative is dedicated to promoting equality, embracing diversity, and creating an inclusive environment for all. We believe that a diverse workforce drives innovation and success, encouraging applicants from all backgrounds, including different races, ethnicities, religions, genders, sexual orientations, abilities, and experiences. To empower our global team, we offer flexible work arrangements, mentorship, career mobility, and continuous learning opportunities. At Conneqtion, employment decisions are based on merit, ensuring a fair and inclusive hiring process for all.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

We're looking for a driven Infrastructure Engineer to architect, implement, and maintain powerful observability systems that safeguard the performance and reliability of our most critical systems. In this role, you'll take real ownership: collaborating with cross-functional teams to shape best-in-class observability standards, troubleshooting complex issues, and fine-tuning monitoring tools to exceed SLA requirements. If you're ready to design high-quality solutions, influence our technology roadmap, and make a lasting impact on our product's success, we want to meet you!

Responsibilities:
- Improve alerting across SentiLink systems and services, developing high-quality monitoring capabilities while actively reducing false positives.
- Troubleshoot, debug, and resolve infrastructure issues as they arise; participate in on-call rotations for production issues.
- Define and refine Service Level Indicators (SLIs), Service Level Objectives (SLOs), and Service Level Agreements (SLAs) in collaboration with product and engineering teams.
- Develop monitoring and alerting configurations using IaC solutions such as Terraform.
- Build and maintain dashboards to provide visibility into system performance and reliability.
- Collaborate with engineering teams to improve root cause analysis processes and reduce Mean Time to Recovery (MTTR).
- Drive cost optimization for observability tools like Datadog, CloudWatch, and Sumo Logic.
- Perform capacity testing to build a deep understanding of infrastructure performance under load, and develop alerting based on the learnings.
- Oversee, develop, and operate Kubernetes and service mesh infrastructure, ensuring smooth performance and reliability.
- Investigate operational alerts, identify root causes, and compile comprehensive root cause analysis reports; pursue action items relentlessly until they are thoroughly completed.
- Conduct in-depth examinations of database operational issues, actively developing and improving database architecture, schema, and configuration for enhanced performance and reliability.
- Develop and maintain incident response runbooks and improve processes to minimize service downtime.
- Research and evaluate new observability tools and technologies to enhance system monitoring.

Requirements:
- 5 years of experience in cloud infrastructure, DevOps, or systems engineering.
- Expertise in AWS and infrastructure-as-code development.
- Experience with CI/CD pipelines and automation tools.
- Experience managing observability platforms, building monitoring dashboards, and configuring high-quality, actionable alerting.
- Strong understanding of Linux systems and networking.
- Familiarity with container orchestration tools like Kubernetes or Docker.
- Excellent analytical and problem-solving skills.
- Experience operating enterprise-size databases; Postgres, Aurora, Redshift, and OpenSearch experience is a plus.
- Experience with Python or Golang is a plus.

Perks:
- Employer-paid group health insurance for you and your dependents.
- 401(k) plan with employer match (or equivalent for non-US-based roles).
- Flexible paid time off.
- Regular company-wide in-person events.
- Home office stipend, and more!
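Alerting work of the kind described is often scripted against the cloud provider's API. As a hedged sketch only — the alarm name, instance id, and SNS topic ARN are hypothetical placeholders — creating a CloudWatch CPU alarm with boto3 might look like:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Hypothetical alarm: page when average CPU on one instance stays above 80%
# for three consecutive 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=3,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:example-oncall-topic"],
    TreatMissingData="breaching",  # treat missing data as unhealthy
)
```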

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office

Confluent Kafka Specialist
Pune, Maharashtra, India

Job Description

Key Responsibilities:
- Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
- Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
- Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
- Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
- Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
- Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate.
- Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
- Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
- Stay updated with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
- Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
- Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
- Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
- Hands-on experience with IBM Analytics.
- Solid understanding of core banking systems, transactional databases, and financial data flows.
- Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
- Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
- Strong experience in event-driven architectures, microservices, and API integrations.
- Familiarity with security protocols, compliance, and data governance in banking environments.
- Excellent problem-solving, leadership, and stakeholder communication skills.

Required Skills: CDC, Kafka Connect, Kafka Streams, Schema Registry, Debezium, GoldenGate, Qlik Replicate, troubleshooting, SQL, NoSQL databases (e.g., Mongo), microservices, API integration
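CDC pipelines of the kind described are typically wired up by registering a Debezium connector with the Kafka Connect REST API. As a hedged sketch — the host names, credentials, and table list are hypothetical placeholders — the registration call from Python could look like:

```python
import requests

# Hypothetical Kafka Connect worker endpoint.
CONNECT_URL = "http://localhost:8083/connectors"

# Minimal Debezium PostgreSQL source connector configuration.
connector = {
    "name": "example-postgres-cdc",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "db.example.internal",
        "database.port": "5432",
        "database.user": "cdc_user",
        "database.password": "***",
        "database.dbname": "core_banking",
        "topic.prefix": "bank",
        "table.include.list": "public.transactions",
    },
}

resp = requests.post(CONNECT_URL, json=connector, timeout=30)
resp.raise_for_status()
print("connector created:", resp.json()["name"])
```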

Posted 1 month ago

Apply

5.0 - 10.0 years

40 - 45 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

At GoDaddy the future of work looks different for each team. Some teams work in the office full-time, others have a hybrid arrangement (they work remotely some days and in the office some days), and some work entirely remotely. This is a remote position, so you'll be working remotely from your home. You may occasionally visit a GoDaddy office to meet with your team for events or meetings.

Join our team...
As a data-driven company, GoDaddy is seeking a dedicated and highly motivated analyst to join our Business Analytics team. On the BA team, you will be integral to our company's growth through the insights you uncover and the thoughtful recommendations you make. You know and love working with analytic tools, can write excellent SQL and scripts, have strong statistical skills, are very effective and efficient at crafting visual dashboards, and can use your technical skills and creative approaches to drive product strategy, reduce customer attrition, and find new revenue opportunities. You know the latest on AI trends and best practices, and love to find ways to apply AI to your work. You will join forces with leaders across Product, Engineering, Marketing, Strategy, Finance, and more. You must be able to tell a clear and succinct story from the data and communicate findings to the executive team. The right person will raise the profile and excellence of our entire team. You can make a difference here!

What you'll get to do...
- Contribute to GoDaddy's growth by understanding business objectives, developing analytical project requirements, providing answers to business questions, making recommendations, and contributing to business improvement and optimization.
- Use data to tell a story. Conduct in-depth analysis to identify actionable insights, suggest experimentation ideas, make recommendations, and influence the direction of the business by communicating findings effectively to global cross-functional groups.
- Apply proven coding and strong analytical skills to learn about customers and turn innovative ideas into working solutions.
- Build multifaceted, rich dashboards that show results in easy-to-understand visualisations that help our partners understand their business and its drivers.
- Build end-to-end data solutions, from analytic-designed data sets to the design, development, and implementation of enterprise-wide views and custom reporting. Work with a team of analysts in the collection and dissemination of company performance via multiple reporting tools/methodologies.
- Perform large-scale data analysis and develop effective models for segmentation, classification, optimization, time series, customer behavior, etc.
- Seek opportunities to use AI tools & techniques to generate insights and build models, integrate AI-powered features into customer experiences, and more.

Your experience should include...
- 5+ years working in a technical capacity in a corporate setting, OR a master's degree (or equivalent experience) in a related field with 3+ years of technical work experience.
- Proficiency in using SQL to discover, aggregate, and extract data is a MUST. Large dataset experience is a plus. Knowledge and experience in AWS/Redshift and Alation is preferred.
- Proficiency with prompt engineering to perform sophisticated analytical tasks.
- Experience with data visualization and business intelligence tools like Tableau, Google Analytics, or other programs.
- Familiarity with analytical techniques, including trend analysis, forecasting, regression, and experiment design (A/B tests) and analysis.
- Knowledge of Python or R.
- Familiarity with ETL implementation and maintenance and a working understanding of schema design and dimensional data modeling.
- Advanced user of Microsoft Office tools such as Excel and PowerPoint.
- Ability to partner and collaborate across teams in multiple time zones, context-switching between technical discussions of databases and queries and business discussions about customer behavior and revenue generation.

You might also have...
- A bachelor's degree or equivalent experience in a quantitative field such as Mathematics, Statistics, Computer Science, Data Science, Engineering, Finance, or Economics.

We've got your back...
We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy's benefits vary based on individual role and location and can be reviewed in more detail during the interview process.
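Given the emphasis on experiment design and A/B test analysis, here is a hedged, illustrative example — the conversion counts are made up — of a two-proportion z-test in Python using statsmodels:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment results: conversions out of visitors per variant.
conversions = [420, 480]     # control, treatment
visitors = [10_000, 10_000]

# Two-sided z-test for a difference in conversion rates.
z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)

rate_a, rate_b = (c / n for c, n in zip(conversions, visitors))
print(f"control: {rate_a:.2%}, treatment: {rate_b:.2%}")
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is statistically significant at the 5% level")
else:
    print("no significant difference detected")
```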

Posted 1 month ago

Apply

5.0 - 6.0 years

2 - 5 Lacs

Pune

Work from Office

Engineer Sales Support

The job responsibilities will include (but not be limited to):
- Confirmation of dispatch date with customer
- Co-ordinating payment status & vehicle arrangement
- Vehicle arrangement with the right vehicle size
- Vendor registration documentation
- Material readiness before vehicle arrival
- Confirmation of delivery to customer
- Co-ordination for PI & POCL

Mandatory Requirements:
- Customer interaction & engagement
- Co-ordination with all stakeholders
- Planning skills
- Communication

Job Location: Rotational Moulding - Urse, Pune
Job Type: Full Time
Education: B.E./B.Tech (Mechanical/Electrical)
Experience: 5 to 6 Years
Function: Sales

Posted 1 month ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Pune

Work from Office

Engineer Design

The job responsibilities will include (but not be limited to):
- Design and development of rotational moulding moulds and blow moulds.
- Interaction with sales and customers to understand customer needs in detail.
- Check the feasibility of the product with respect to manufacturing.
- Prepare detailed DFM and communicate with internal and external customers.
- Estimate the mould cost and communicate with customers.
- Coordinate with internal customers and the tool room manufacturing team for the successful completion of the project.
- Sampling knowledge and troubleshooting of sampling issues specific to moulds.
- Adaptable to software such as mould sanction software, ERP, etc.
- Creative, with product modeling knowledge.
- Fair knowledge of the different tests required for roto-moulded products, coordinating as and when required.
- Creative enough to design moulds with consideration of value addition to the customer.

Job Location: Engineering Support - Urse, Pune
Job Type: Full Time
Education: Diploma - Mechanical
Experience: 5 to 6 Years
Function: Design

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Bengaluru

Work from Office

As a Senior R&D Engineer, you will be responsible for designing, developing, and maintaining high-quality software solutions, with expertise in Java and Spring Boot. You also have experience in UML modeling, JSON Schema, and NoSQL databases. You should have strong skills in cloud-native development and microservices architecture, with additional knowledge of scripting and Helm charts.

You have:
- A bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in software development with a focus on Java and Spring Boot.
- Exposure to CI/CD tools (Jenkins, GitLab CI).

It would be nice if you also had:
- An understanding of RESTful API design and implementation.
- Relevant certifications (e.g., AWS Certified Developer, Oracle Certified Professional Java SE).
- Knowledge of container orchestration and management.
- Familiarity with Agile development methodologies.

Responsibilities:
- Design and develop high-quality applications using Java and Spring Boot, implementing and maintaining RESTful APIs and microservices.
- Create and maintain UML diagrams for software architecture, define and manage JSON schemas, and optimize NoSQL databases like Neo4j, MongoDB, and Cassandra for efficient data handling.
- Develop and deploy cloud-native applications using AWS, Azure, or OCP, ensuring scalability and resilience in microservices environments.
- Manage Kubernetes deployments with Helm charts, collaborate with DevOps teams to integrate CI/CD pipelines, and automate tasks using Python and Bash scripting.
- Ensure efficient data storage and retrieval, optimize system performance, and support automated deployment strategies.
- Maintain comprehensive documentation for software designs, APIs, and deployment processes, ensuring clarity and accessibility.
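Since the role involves defining and managing JSON schemas, here is a hedged, minimal illustration — the schema and payload are hypothetical — of validating a document with the Python jsonschema package:

```python
from jsonschema import validate, ValidationError

# Hypothetical schema for a device event.
schema = {
    "type": "object",
    "properties": {
        "device_id": {"type": "string"},
        "temperature": {"type": "number", "minimum": -50, "maximum": 150},
        "firmware": {"type": "string", "pattern": r"^\d+\.\d+\.\d+$"},
    },
    "required": ["device_id", "temperature"],
}

event = {"device_id": "sensor-7", "temperature": 21.5, "firmware": "1.4.2"}

try:
    validate(instance=event, schema=schema)
    print("event conforms to the schema")
except ValidationError as err:
    print("schema violation:", err.message)
```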

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies