Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
7.0 - 12.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Job Title: Principal Data Engineer (Java | AWS | Spark | Kafka | MySQL | Elasticsearch)

About Skyhigh Security: Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company. Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our Blast Talks learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self. We are on these too! Follow us on LinkedIn and Twitter @SkyhighSecurity.

Role Overview: Our Engineering team is driving the future of cloud security, developing one of the world's largest, most resilient cloud-native data platforms. At Skyhigh Security, we're enabling enterprises to protect their data with deep intelligence and dynamic enforcement across hybrid and multi-cloud environments. As we continue to grow, we're looking for a Principal Data Engineer to help us scale our platform, integrate advanced AI/ML workflows, and lead the evolution of our secure data infrastructure.

Responsibilities: As a Principal Data Engineer, you will be responsible for: Leading the design and implementation of high-scale, cloud-native data pipelines for real-time and batch workloads. Collaborating with product managers, architects, and backend teams to translate business needs into secure and scalable data solutions. Integrating big data frameworks (like Spark, Kafka, Flink) with cloud-native services (AWS/GCP/Azure) to support security analytics use cases. Driving CI/CD best practices, infrastructure automation, and performance tuning across distributed environments. Evaluating and piloting the use of AI/LLM technologies in data pipelines (e.g., anomaly detection, metadata enrichment, automation). Evaluating and integrating LLM-based automation and AI-enhanced observability into engineering workflows. Ensuring data security and privacy compliance. Mentoring engineers, ensuring high engineering standards, and promoting technical excellence across teams.

What We're Looking For (Minimum Qualifications): 10+ years of experience in big data architecture and engineering, including deep proficiency with the AWS cloud platform. Expertise in distributed systems and frameworks such as Apache Spark, Scala, Kafka, Flink, and Elasticsearch, with experience building production-grade data pipelines.
Strong programming skills in Java for building scalable data applications. Hands-on experience with ETL tools and orchestration systems. Solid understanding of data modeling across both relational (PostgreSQL, MySQL) and NoSQL (HBase) databases and performance tuning. What Will Make You Stand Out (Preferred Qualifications): Experience integrating AI/ML or LLM frameworks (e.g., LangChain, LlamaIndex) into data workflows. Experience implementing CI/CD pipelines with Kubernetes, Docker, and Terraform. Knowledge of modern data warehousing (e.g., BigQuery, Snowflake) and data governance principles (GDPR, HIPAA). Strong ability to translate business goals into technical architecture and mentor teams through delivery. Familiarity with visualization tools (Tableau, Power BI) to communicate data insights, even if not a primary responsibility. #LI-MS1 Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees: Retirement Plans; Medical, Dental and Vision Coverage; Paid Time Off; Paid Parental Leave; Support for Community Involvement. We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.
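The posting above calls out anomaly detection in streaming data pipelines. As a hedged illustration only (not Skyhigh's actual implementation, and using Python rather than the Java the role centers on), a rolling z-score check is one minimal way that kind of detector is often sketched:

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=10, threshold=3.0):
    """Return a callable that flags values more than `threshold`
    standard deviations away from the mean of a sliding window."""
    history = deque(maxlen=window)

    def check(value):
        is_anomaly = False
        if len(history) >= 2:  # stdev needs at least two samples
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                is_anomaly = True
        history.append(value)
        return is_anomaly

    return check

detector = make_detector(window=10, threshold=3.0)
stream = [10, 11, 10, 12, 11, 10, 11, 12, 10, 11, 95, 10]
flags = [detector(v) for v in stream]  # only the spike at 95 is flagged
```

In a real Spark or Flink job the same per-key state would live in a stateful operator rather than a closure, but the statistical test is the same.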
Posted 1 week ago
5.0 - 8.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information. Veradigm is here to transform health, insightfully. Veradigm delivers a unique combination of point-of-care clinical and financial solutions, a commitment to open interoperability, a large and diverse healthcare provider footprint, along with industry-proven expert insights. We are dedicated to simplifying the complicated healthcare system with next-generation technology and solutions, transforming healthcare from the point of patient care to everyday life. For more information, please explore www.veradigm.com. What will your job look like: Job Summary: We are seeking an experienced Test Automation Engineer with over 5 years of experience in designing, developing, and executing automated test scripts for complex applications. The ideal candidate will be skilled in both UI and API automation, test framework development, and continuous integration practices, ensuring high-quality software delivery. Key Responsibilities: Design, develop, and maintain automated test frameworks and scripts for web, mobile, and API-based applications. Collaborate with developers, QA analysts, and DevOps engineers to ensure end-to-end quality assurance. Develop and execute automated regression, functional, integration, and performance tests. Maintain test data, environments, and version control of test artifacts. Integrate automated tests into CI/CD pipelines using tools like Jenkins, Azure DevOps, or GitLab.
Identify, log, and track bugs, and work closely with developers for quick resolution. Participate in Agile ceremonies and contribute to estimation and planning. Continuously evaluate test tools, technologies, and practices to improve test efficiency and coverage. An Ideal Candidate will have: Required Skills & Qualifications: 5+ years of hands-on experience in test automation for web and/or enterprise applications. Strong expertise in tools like Selenium, Playwright, Cypress, Postman, REST Assured, or equivalent. Proficiency in one or more programming languages: C#, Java, JavaScript, Python. Experience in testing REST APIs, microservices, and backend components. Advanced proficiency in SQL and performance tuning. Familiarity with BDD frameworks such as SpecFlow, Cucumber, or Behave. Hands-on experience with CI/CD tools like Jenkins, Azure DevOps, GitHub Actions, etc. Good understanding of the software development lifecycle (SDLC) and Agile methodologies. Preferred Qualifications: Experience with performance testing tools such as JMeter, LoadRunner, or k6. Exposure to cloud platforms like Azure, AWS, or Google Cloud. Familiarity with containerized environments (Docker, Kubernetes) and test orchestration. ISTQB or equivalent certification in software testing. Benefits: Veradigm believes in empowering our associates with the tools and flexibility to bring the best version of themselves to work. Through our generous benefits package with an emphasis on work/life balance, we give our employees the opportunity to allow their careers to flourish.
Quarterly Company-Wide Recharge Days; Flexible Work Environment (Remote/Hybrid Options); Peer-based incentive Cheer awards; All in to Win bonus Program; Tuition Reimbursement Program. To know more about the benefits and culture at Veradigm, please visit the links mentioned below: https://veradigm.com/about-veradigm/careers/benefits/ and https://veradigm.com/about-veradigm/careers/culture/ #LI-SM1 #LI-REMOTE Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!
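Several of the automation responsibilities in the posting above (cross-browser UI checks, CI-integrated API tests) reduce to polling an asynchronous system until it reaches an expected state. A generic, framework-free sketch of that explicit-wait pattern, with names of my own choosing, might look like:

```python
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    """Poll `condition` until it returns a truthy value or `timeout`
    seconds elapse; the workhorse behind explicit waits in UI/API tests."""
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError(f"condition not met within {timeout:.1f}s")
        time.sleep(interval)

# Simulated flaky endpoint: reports ready only from the third poll onward.
calls = {"n": 0}
def service_ready():
    calls["n"] += 1
    return calls["n"] >= 3

assert wait_until(service_ready, timeout=1.0) is True
```

Tools like Selenium and Playwright ship their own wait utilities; the point of the sketch is only that polling with a deadline beats fixed `sleep()` calls for reducing flaky tests.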
Posted 1 week ago
5.0 - 8.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information. Veradigm is here to transform health, insightfully. Veradigm delivers a unique combination of point-of-care clinical and financial solutions, a commitment to open interoperability, a large and diverse healthcare provider footprint, along with industry-proven expert insights. We are dedicated to simplifying the complicated healthcare system with next-generation technology and solutions, transforming healthcare from the point of patient care to everyday life. For more information, please explore www.veradigm.com. What will your job look like: Job Summary: We are seeking a detail-oriented and experienced Database and Backend Test Engineer with 5+ years of experience in testing large-scale data platforms, including Snowflake, Azure Data Services, and backend services. The ideal candidate will be responsible for validating data pipelines, backend logic, stored procedures, and integrations, ensuring the accuracy, performance, and quality of enterprise data systems. Key Responsibilities: Design and implement test strategies for backend systems and data pipelines across Snowflake and Azure environments. Write and execute complex SQL queries to validate transformations, stored procedures, and data quality. Perform ETL testing, data reconciliation, schema validation, and metadata checks. Collaborate with data engineers and developers to verify pipeline performance, reliability, and scalability.
Build and maintain automated test scripts using tools like pytest, dbt, or custom SQL-based frameworks. Integrate database tests into CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or Jenkins. Perform root cause analysis on data issues and communicate findings with relevant teams. Monitor and validate data processing jobs and schedule validations using Azure Data Factory, Synapse, or Databricks. Document test scenarios, data sets, and validation logs in a structured manner. An Ideal Candidate will have: Required Skills & Qualifications: 5+ years of experience in database and backend testing. Strong hands-on experience with Snowflake, including data modeling, querying, and security roles. Experience with Azure data tools such as Azure SQL, Data Factory, Synapse Analytics, or Data Lake. Advanced proficiency in SQL and performance tuning. Experience with ETL/ELT testing and validation of data migration or transformation logic. Familiarity with Python or shell scripting for data test automation. Knowledge of CI/CD integration for test automation. Strong understanding of data quality frameworks, data governance, and test reporting. Preferred Qualifications: Experience with dbt, Great Expectations, or other data validation tools. Exposure to cloud storage validation (Azure Blob, ADLS). Experience in testing APIs for data services or backend integrations. Knowledge of data privacy and compliance frameworks (e.g., GDPR, HIPAA).
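The data-reconciliation duty described above (comparing a source table against its loaded target) can be prototyped locally before pointing the same SQL at Snowflake or Azure SQL. A small sketch using SQLite as a stand-in, with invented table and column names:

```python
import sqlite3

def reconcile(conn, source, target, key):
    """Compare row counts between two tables and list key values
    present in the source but missing from the target."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    # Anti-join: source keys with no matching target row.
    missing = cur.execute(
        f"SELECT s.{key} FROM {source} s "
        f"LEFT JOIN {target} t ON s.{key} = t.{key} "
        f"WHERE t.{key} IS NULL"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders_src (order_id INTEGER, amount REAL);
    CREATE TABLE orders_tgt (order_id INTEGER, amount REAL);
    INSERT INTO orders_src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO orders_tgt VALUES (1, 10.0), (2, 20.0);
""")
src, tgt, missing = reconcile(conn, "orders_src", "orders_tgt", "order_id")
```

Production reconciliation would typically add per-column checksums and chunking for volume, but the count-plus-anti-join core is the same.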
Posted 1 week ago
6.0 - 10.0 years
8 - 13 Lacs
Gurugram
Work from Office
Key Accountabilities: Provide technical support and expertise on Oracle EPM Cloud (EPBCS, EPCMCS, FCCS and ARCS) applications. Maintain strong and trusted relationships with stakeholders to coordinate timely and accurate delivery of system enhancements and routine month-end/quarter-end tasks. Coach and develop team members. Lead and review the work of staff by providing mentoring, technical guidance and/or training in daily activities. Provide technical expertise in evaluating new features on EPM Cloud. Assist in the testing and migration to production of software upgrades. Act as primary contact for SOX/SOC1 audit requests from internal and external auditors. Perform performance tuning and monitoring of the EPM Cloud applications. Assist in new project requirements. Participate in testing, implementation in the QA environment, and migration of new processes to production. Review existing and new processes for automation. Recommend process improvements along with their implementation. Ensure documentation of key close processes. Education: Bachelor's or master's degree in a relevant field of work or an equivalent combination of education and work-related experience. Experience: 8+ years of progressive work-related experience in systems administration and programming, with demonstrated mastery of technical and business knowledge and understanding of multiple disciplines/processes related to the position. 4+ years of experience in managing a team and independently managing key deliverables. Technical Skills and Knowledge: Working experience supporting Oracle EPM Cloud applications. Demonstrated knowledge of Hyperion Essbase (ideally System 11), Shared Services (Security) and Financial Reporting. Demonstrated understanding of ETL tools (ideally Data Management), RDBMS and multi-dimensional databases. Demonstrated understanding of basic Unix shell and Windows batch scripting. Demonstrated experience in resource management and team performance management.
Posted 1 week ago
0.0 - 1.0 years
0 Lacs
Kochi
Work from Office
Assist in the design, development, and maintenance of Oracle database components. Write and optimize SQL queries, stored procedures, and functions. Support data migration, data cleansing, and performance tuning efforts. Collaborate with developers and analysts to implement database-driven solutions. Troubleshoot issues and assist in resolving database-related problems. Document technical processes, queries, and configurations as required. Pursuing or recently completed a degree in Computer Science, Information Technology, or related field. Good understanding of relational database concepts and SQL. Familiarity with Oracle Database and PL/SQL. Ability to write and debug SQL queries effectively. Basic understanding of database performance and indexing. Strong analytical and problem-solving skills
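The internship above asks for a basic understanding of indexing and query performance. One way to see an index change a query plan, using SQLite as a lightweight stand-in for Oracle (the EXPLAIN syntax differs between the two, but the scan-versus-index-search idea carries over):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE employees (id INTEGER PRIMARY KEY, dept TEXT, salary REAL)"
)
cur.executemany(
    "INSERT INTO employees (dept, salary) VALUES (?, ?)",
    [("dept_%d" % (i % 20), 1000.0 + i) for i in range(1000)],
)

# Before indexing: the planner must scan every row to filter on dept.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM employees WHERE dept = 'dept_7'"
).fetchall()

cur.execute("CREATE INDEX idx_employees_dept ON employees (dept)")

# After indexing: the planner searches via the index instead.
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM employees WHERE dept = 'dept_7'"
).fetchall()
```

The last column of each plan row is a human-readable detail string: a full-table "SCAN" before the index exists, and a "SEARCH ... USING INDEX" afterwards.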
Posted 1 week ago
12.0 - 15.0 years
45 - 50 Lacs
Gurugram
Work from Office
Database Architect. About Junglee Games: With over 140 million users, Junglee Games is a leader in the online skill gaming space. Founded in San Francisco in 2012 and part of the Flutter Entertainment Group, we are revolutionizing how people play games. Our notable games include Howzat, Junglee Rummy, and Junglee Poker. Our team comprises over 900 talented individuals who have worked on internationally acclaimed AAA titles like Transformers and Star Wars: The Old Republic and contributed to Hollywood hits such as Avatar. Junglee's mission is to build entertainment for millions of people around the world and connect them through games. Junglee Games is not just a gaming company but a blend of innovation, data science, cutting-edge tech, and, most importantly, a values-driven culture that is creating the next set of conscious leaders. Job overview: As our Database Architect, you will play a pivotal role in leading the design, optimization, and management of our organization's database systems. The ideal candidate will have a strong background in MS SQL, MongoDB, PostgreSQL, and Redis, with a proven track record in data modeling, performance tuning, and scaling large and complex database architectures across multiple platforms. Job Location: Bangalore / Gurgaon. Key responsibilities: Design and implement scalable, secure, and highly available database architectures. Lead the evaluation and selection of appropriate database solutions and tools based on business requirements. Create and maintain data models, data flow diagrams, and technical documentation. Optimize database performance, storage, and query efficiency through indexing, caching, and best practices. Implement database security policies, backup and recovery procedures, and high availability solutions. Collaborate with DevOps, backend, and application teams to support data access patterns and integration. Guide and mentor junior DBAs and database engineers.
Monitor and troubleshoot production databases, proactively identifying and resolving issues. Drive modernization efforts including cloud database migrations and adoption of newer technologies. Qualifications & skills required: 12-15 years of experience in database architecture and engineering roles. Expertise in MS SQL Server, including complex stored procedures, indexing strategies, replication, and clustering. Hands-on experience with MongoDB in high-throughput and distributed environments. Deep understanding of PostgreSQL internals, extensions, and performance tuning. Solid working knowledge of Redis for caching and real-time data processing. Strong understanding of database security, disaster recovery, and data lifecycle management. Experience with cloud platforms (AWS, Azure, GCP) and database-as-a-service offerings is a plus. Proficient in data modeling tools and techniques (ERD, normalization, etc.). Excellent problem-solving skills and ability to handle pressure in high-availability environments. Strong communication skills and ability to present solutions to both technical and non-technical stakeholders. Be a part of Junglee Games to: Value Customers & Data - Prioritize customers, use data-driven decisions, master KPIs, and leverage ideation and A/B testing to drive impactful outcomes. Inspire Extreme Ownership - We embrace ownership, collaborate effectively, and take pride in every detail to ensure every game becomes a smashing success. Lead with Love - We reject micromanagement and fear, fostering open dialogue, mutual growth, and a fearless yet responsible work ethic. Embrace Change - Change drives progress, and our strength lies in adapting swiftly and recognizing when to evolve to stay ahead. Play the Big Game - We think big, challenge norms, and innovate boldly, driving impactful results through fresh ideas and inventive problem-solving.
Avail a comprehensive benefits package that includes paid gift coupons, fitness plans, gadget allowances, fuel costs, family healthcare, and much more. Know more about us Explore the world of Junglee Games through our website, www.jungleegames.com . Get a glimpse of what Life at Junglee Games looks like on LinkedIn . Here is a quick snippet of the Junglee Games Offsite 24 Liked what you saw so far? Be A Junglee
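The Redis caching responsibility in the posting above usually takes the form of the cache-aside pattern: check the cache first, fall back to the database on a miss, then populate the cache with a TTL. A hedged sketch with a plain dict standing in for Redis (all names here are illustrative, not Junglee's code):

```python
import time

class CacheAside:
    """Cache-aside with TTL: read through the cache, loading from the
    backing store only on a miss or expiry. A dict stands in for Redis."""

    def __init__(self, loader, ttl=60.0):
        self.loader = loader   # e.g. a database lookup on miss
        self.ttl = ttl         # seconds an entry stays fresh
        self._store = {}       # key -> (value, expiry timestamp)
        self.hits = 0
        self.misses = 0

    def get(self, key):
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.monotonic():
            self.hits += 1
            return entry[0]
        self.misses += 1
        value = self.loader(key)
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

db_calls = []
cache = CacheAside(loader=lambda k: db_calls.append(k) or f"row-for-{k}")
first = cache.get("user:42")   # miss: invokes the loader
second = cache.get("user:42")  # hit: served from the cache
```

Against real Redis the dict becomes `SETEX`/`GET` calls and expiry is enforced server-side; the hit/miss control flow is unchanged.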
Posted 1 week ago
6.0 - 9.0 years
8 - 11 Lacs
Chennai
Work from Office
Job Title: Data Modeller - GCP Experience: 6-9 Years Work Type: On-site Work Location: Chennai (Work from Client Office - Mandatory) Job Description We are seeking a skilled Data Modeller with strong experience in data modelling for OLTP and OLAP systems, particularly within Google Cloud Platform (GCP). The ideal candidate will be hands-on with designing efficient, scalable data architectures and have a solid grasp of performance tuning and cloud-based databases. Key Responsibilities: Design and implement Conceptual, Logical, and Physical Data Models for OLTP and OLAP systems Apply best practices in data indexing, partitioning, and sharding for optimized performance Use data modelling tools (preferably DBSchema) to support and document database design Ensure data architecture supports near real-time reporting and application performance Collaborate with cross-functional teams to translate business requirements into data structures Work with GCP database technologies like AlloyDB, CloudSQL, and BigQuery Validate and improve database performance metrics through continuous optimization Must-Have Skills: GCP: AlloyDB, CloudSQL, BigQuery Strong hands-on experience with data modelling tools (DBSchema preferred) Expertise in OLTP & OLAP data models, indexing, partitioning, and data sharding Deep understanding of database performance tuning and system architecture Good to Have: Functional knowledge of the mutual fund industry Exposure to data governance and security best practices in the cloud
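The posting above lists data sharding among its must-have skills. The core of sharding is a deterministic, stable mapping from record key to shard; a minimal sketch (the shard count and key format are illustrative):

```python
import hashlib

def shard_for(key, num_shards=8):
    """Map a record key to a shard deterministically. A stable digest
    (md5 here) is used rather than Python's salted built-in hash(),
    so placement survives process restarts."""
    digest = hashlib.md5(str(key).encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_shards

placement = shard_for("fund-holder-42")
```

Note that naive modulo sharding reshuffles almost every key when `num_shards` changes, which is why consistent hashing is usually preferred when shards are added or removed often.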
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Pune, Bengaluru
Work from Office
Data Modeller. Job ID: Dat-ETP-Ban-1036. Location: Bangalore, Pune, India, Other. Company Overview: Bridgenext is a global consulting company that provides technology-empowered business solutions for world-class organizations. Our global workforce of over 800 consultants provides best-in-class services to our clients to realize their digital transformation journey. Our clients span the emerging, mid-market and enterprise space. With multiple offices worldwide, we are uniquely positioned to deliver digital solutions to our clients leveraging Microsoft, Java, and Open Source with a focus on Mobility, Cloud, Data Engineering, and Intelligent Automation. Bridgenext's singular mission is to create Clients for Life: long-term relationships that deliver rapid, meaningful, and lasting business value. At Bridgenext, we have a unique blend of corporate and entrepreneurial cultures. This is where you would have an opportunity to drive business value for clients while you innovate and continue to grow and have fun while doing it. You would work with team members who are vibrant, smart, and passionate, and they bring their passion to all that they do, whether it's learning, giving back to our communities, or always going the extra mile for our clients. Position Description: We are looking for a data modeler with hands-on Snowflake experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of the code and who is passionate about providing the best solution to meet the client's needs and anticipates their future needs based on an understanding of the market. Someone who has worked on Snowflake projects, including data modeling with Snowflake and Azure. Must Have Skills: 4-8 years of overall experience. 4 years' experience in designing, implementing, and documenting data architecture and data modeling solutions, which include the use of Azure SQL and Snowflake databases and SQL procedures.
Knowledge of relational databases and data architecture computer systems, including SQL Be responsible for the development of conceptual, logical, and physical data models, the implementation of operational data store (ODS), data marts, and data lakes on target platforms (Azure SQL and Snowflake databases). Knowledge of ER modeling, big data, enterprise data, and physical data models Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively. Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Must have a strong knowledge of Data Quality and Data Governance. Must have knowledge of ETL. Professional Skills: Solid written, verbal, and presentation communication skills. Strong team and individual player. Maintains composure during all types of situations and is collaborative by nature. High standards of professionalism, consistently producing high-quality results. Self-sufficient, independent requiring little supervision or intervention. Demonstrate flexibility and openness to bring creative solutions to address issues.
Posted 1 week ago
4.0 - 5.0 years
6 - 7 Lacs
Hyderabad
Work from Office
Key Responsibilities: Design, build, and maintain dynamic and user-friendly dashboards using Tableau and Looker. Develop and optimize LookML models in Looker and manage data exploration layers for reusable insights. Use Tableau Prep and Looker data modelling to cleanse, shape, and transform data. Collaborate with data engineers, analysts, and business stakeholders to understand reporting requirements and deliver relevant visualizations. Ensure data accuracy, performance tuning, and alignment with business KPIs and metrics. Maintain version control, content organization, and user access in Tableau Server and Looker environments. Implement best practices in visualization design, layout, interactivity, and storytelling. Support the migration and redesign of legacy dashboards (e.g., from Power BI or Qlik) into Tableau or Looker as required. Automate reporting workflows and enable self-service BI through guided dashboards. Collaborate with IT or data platform teams to troubleshoot connectivity, access, and integration issues. Requirements: 4-5 years of hands-on experience in data visualisation using Tableau and Looker. Strong experience with LookML, dashboard development, and Explore setup in Looker. Expertise in Tableau Desktop and Tableau Server/Public, with solid knowledge of calculated fields, parameters, filters, and Tableau dashboards. Good understanding of data modelling, SQL, and working with large, complex datasets (preferably in Snowflake, BigQuery, or SQL Server). Familiarity with data governance, permissions management, and user licensing in Tableau and Looker. Ability to apply design principles, storytelling techniques, and UX best practices in dashboards. Strong communication and stakeholder management skills. Bonus: Experience with Power BI, Power Automate, or migration projects from other BI platforms.
Posted 1 week ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Senior Full Stack Developer (React.js + Node.js). Experience: 7-8 Years. Location: Bangalore. Notice Period: Immediate Joiners Only. Job Overview: We are looking for a highly skilled and experienced Senior Full Stack Developer with a strong command of React.js and Node.js to join our dynamic team. This role requires a seasoned developer who can take ownership of features end-to-end, work independently as well as collaboratively, and demonstrate strong problem-solving abilities. Key Responsibilities: Develop and maintain high-performance, scalable web applications using React.js and Node.js. Design and implement responsive UI components using modern JavaScript, HTML, and CSS. Work with state management tools such as Redux or Flux. Develop and test user-facing features across multiple browsers and devices. Utilize iframes and other frontend integration methods when required. Conduct unit testing and assist in system testing and debugging. Use browser-based tools to optimize application performance. Collaborate with cross-functional teams to define, design, and deliver new features. Required Skills: Strong hands-on experience with React.js and Node.js. Proficient in JavaScript, HTML, CSS, and modern frontend practices. Deep understanding of Redux, Flux, and related React tools. Practical experience with iframes and embedding techniques. Strong knowledge of UI/UX principles and design standards. Experience with unit testing frameworks. Familiarity with browser-based debugging tools and performance tuning. Excellent troubleshooting and problem-solving skills.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Experience: 5-10 years. Number of openings: 3. What awaits you / Job Profile: We are seeking a reliable and technically proficient Infrastructure & Operations Engineer to manage the maintenance, operations, and monitoring of 3DEXPERIENCE (3DX) and CATIA environments. This role ensures high availability, performance, and security of PLM and CAD systems, supporting engineering and product development teams. What should you bring along? Maintenance & Operations: Perform routine maintenance, patching, and upgrades of 3DX and CATIA environments. Ensure system uptime, performance tuning, and capacity planning. Manage user provisioning, license administration, and access control. Maintain environment documentation, SOPs, and change logs. Infrastructure Management: Administer servers (Windows/Linux), databases (Oracle/SQL Server), and application servers (Tomcat/WebLogic) hosting 3DX/CATIA. Collaborate with IT and network teams to ensure infrastructure scalability, security, and licensing policies. Support backup, disaster recovery, and business continuity planning. Maintain and automate deployment scripts or tools for client installation and configuration (e.g., silent installs, PowerShell). Monitoring & Support: Implement and manage monitoring tools (Nagios, SCOM, Zabbix, Splunk, Dynatrace) for proactive issue detection. Monitor system logs, performance metrics, and alerts to ensure system health. Troubleshoot and resolve infrastructure-related incidents and service requests. Provide L2/L3 support for CATIA users and coordinate with Dassault Systèmes support when needed. Administer and monitor license servers, analyze usage, and forecast licensing needs. Must-have technical skills: Bachelor's degree in Information Technology, Computer Science, or a related field. Hands-on experience with the 3DEXPERIENCE platform and CATIA V5/V6. Proficiency in system administration (Windows/Linux), networking, and database management.
Familiarity with monitoring tools and ITSM platforms (e.g., ServiceNow, BMC Remedy). Familiarity in CATIA Data Migration between different releases / versions . Good to have technical skills Dassault Syst mes certifications in CATIA or 3DEXPERIENCE. Experience with virtualization (VMware), cloud platforms (AWS, Azure), and containerization (Docker, Kubernetes). Knowledge of scripting (Shell, PowerShell, Python) for automation. ITIL Foundation certification. Support integrations between CATIA and other enterprise tools (e.g., PDM/PLM, simulation tools).
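The license-server duty above (analyze usage, forecast needs) usually starts from checkout/checkin events. A minimal sketch, assuming a hypothetical pre-parsed event list (real license logs, e.g. from DSLS or FlexLM, need their own parser):

```python
from datetime import datetime

# Hypothetical checkout (+1) / checkin (-1) events from a license-server
# log; timestamps and counts are made up for illustration.
events = [
    ("2024-06-01 09:00", +1), ("2024-06-01 09:15", +1),
    ("2024-06-01 09:30", +1), ("2024-06-01 10:00", -1),
    ("2024-06-01 10:30", +1), ("2024-06-01 11:00", -1),
]

def peak_concurrent(events):
    """Return the maximum number of licenses held at once."""
    current = peak = 0
    for ts, delta in sorted(events,
                            key=lambda e: datetime.strptime(e[0], "%Y-%m-%d %H:%M")):
        current += delta
        peak = max(peak, current)
    return peak

print(peak_concurrent(events))  # -> 3
```

Tracking the peak rather than the average is what matters for forecasting: licenses are sized to concurrent demand.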
Posted 1 week ago
4.0 - 6.0 years
6 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES Provides full design, planning, configuration, documentation, deployment and top-level support ownership of storage infrastructure technologies. Identifies design requirements and makes recommendations for capacity planning, performance optimization and future direction. Designs storage solutions per business requirements. This includes performing storage workload modeling for sizing, optimization and troubleshooting. Researches and compares system/OS features and works with vendors on system sizing for specific applications. Understands storage virtualization, data rationalization, workload automation, storage provisioning, Disaster Recovery and SAN Fabric management. Troubleshoots storage-related reliability, availability, and performance issues. Collaborates on and implements architecture recommendations to application integration, system administration, problem management, preventive maintenance, performance tuning. Identifies and eliminates performance bottlenecks and makes performance-related recommendations (hardware, software, configuration). Leads or participates in the software development lifecycle, which includes research, new development, modification, security, correction of errors, reuse, re-engineering and maintenance of software products. Manages or utilizes software that is built and implemented as a product, using best-in-class development process/lifecycle management (ex: Agile, Waterfall). Gathers business requirements and participates in product definition and feature prioritization, including customer usability studies. Performs competitive analysis for features at a product level scope. Leads the testing and fixing of new or enhanced products. Creates technical documentation of software products/solutions. Assists with the development and review of end user and technical end user documentation. Drives idea generation for new software products, or for the next version of an existing product. 
Protects intellectual property by working with appropriate legal elements (ex: procurement, patents, open source). Responsible for the delivery of products within budget, schedule and quality guidelines. Works with the team to develop, maintain, and communicate current development schedules, timelines and development status. Makes changes to system software to correct errors in the original implementation and creates extensions to existing programs to add new features or performance improvements. Designs and develops major functional or performance enhancements for existing products, or produces new software products or tools. Reviews requirements, specifications and designs to assure product quality; develops and implements plans and tests for product quality or performance assurance. RESPONSIBILITIES Participates in the preparation, review and analysis of software/storage requirements and specifications. Prepares written specifications from verbal requirements for tasks of mid-level complexity. Prepares design, functional, technical and/or user documentation, as needed. Uses defined software lifecycle methodologies. Reviews and implements test strategies for software products. Follows source code and file revision control for projects. Clearly communicates project issues and status. Accurately logs project schedule, defect, and other data. Analyzes and prepares trend reports on quality metrics. Participates in improving product quality through process and procedure improvements. Essential Requirements: Familiarity with C/C++ programming, Linux programming, OS internals, memory management, IPC, thread programming and application software development. Good basics in data structures, multi-threading, IPC, socket programming. Ability to code/debug programs. Knowledge of automation languages - Python frameworks or similar. Awareness of and exposure to testing methodologies.
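The multi-threading/IPC basics this posting lists boil down to coordinating a producer and a consumer over a shared channel. The posting targets C/C++; as a language-neutral sketch, a thread-safe queue plays the role of the IPC channel:

```python
import threading, queue

# Producer-consumer sketch: one thread enqueues work, another dequeues
# and processes it; a None sentinel signals end-of-stream.
def producer(q, items):
    for item in items:
        q.put(item)
    q.put(None)  # sentinel: no more work

def consumer(q, results):
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, [1, 2, 3]))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # -> [2, 4, 6]
```

In C/C++ the same shape appears as a mutex- and condition-variable-guarded ring buffer, or a POSIX message queue.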
Posted 1 week ago
5.0 - 8.0 years
8 - 13 Lacs
Chennai
Work from Office
The Agilysys Hospitality Cloud combines core operational systems for property management (PMS), point-of-sale (POS) and Inventory and Procurement (IP) with Experience Enhancers that meaningfully improve interactions for guests and for employees across dimensions such as digital access, mobile convenience, self-service control, personal choice, payment options, service coverage and real-time insights to improve decisions. Core solutions and Experience Enhancers are selectively combined in Hospitality Solution Studios tailored to specific hospitality settings and business needs. Agilysys operates across the Americas, Europe, the Middle East, Africa, Asia-Pacific, and India with headquarters located in Alpharetta, GA. For more information, visit Agilysys.com. Requirements & Responsibilities: Proficiency in MongoDB data modeling. Strong experience with MongoDB query and index tuning. Experience with MongoDB sharding and replication. Troubleshooting MongoDB bottlenecks. State-of-the-art MongoDB performance tuning capabilities. Respond to incidents and ability to bring them to closure. Ensure that the databases achieve maximum performance and availability. Recommend and implement best practices. Passion for troubleshooting the toughest problems and proposing creative solutions. Desired Experience: Hospitality experience. Skills: MongoDB, Atlas.
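Query and index tuning in MongoDB often comes down to compound-index key order, where a common rule of thumb is ESR: Equality fields first, then Sort fields, then Range fields. A minimal sketch (field names are hypothetical, not from any Agilysys schema) that builds the key list you would hand to `create_index`:

```python
# "Equality, Sort, Range" (ESR) rule of thumb for ordering keys in a
# MongoDB compound index. Pure Python: it only builds the index spec.
def esr_index_keys(equality, sort, range_):
    keys = []
    keys += [(f, 1) for f in equality]   # equality predicates first
    keys += [(f, d) for f, d in sort]    # then sort keys, with direction
    keys += [(f, 1) for f in range_]     # range predicates last
    return keys

# e.g. find({status: "open", amount: {$gt: 100}}).sort({created: -1})
spec = esr_index_keys(
    equality=["status"],
    sort=[("created", -1)],
    range_=["amount"],
)
print(spec)  # -> [('status', 1), ('created', -1), ('amount', 1)]
```

With pymongo this spec would be passed as `collection.create_index(spec)`; verifying the win is then an `explain()` check that the plan is an index scan rather than a collection scan.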
Posted 1 week ago
3.0 - 6.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Design, develop, and deploy machine learning models for various use cases (classification, regression, NLP, CV, etc.). Preprocess, clean, and transform large datasets for training and testing. Conduct model evaluation, performance tuning, and error analysis. Collaborate with data engineers, product managers, and business stakeholders to deliver intelligent solutions. Implement and manage scalable ML pipelines using tools like MLflow, Airflow, or Kubeflow. Work with cloud platforms such as AWS, Azure, or GCP for model training and deployment. Stay updated with the latest advancements in AI/ML technologies and frameworks. Requirements: Strong programming skills in Python and libraries such as Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch. Experience with model deployment and MLOps practices. Good understanding of data structures, algorithms, and statistics. Hands-on experience with cloud platforms (AWS/GCP/Azure). Proficiency in SQL and working with databases/data warehouses. Knowledge of Docker/Kubernetes is a plus. Excellent problem-solving and communication skills.
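The model-evaluation duty above reduces, for binary classification, to a handful of counts. A standard-library-only sketch (scikit-learn's `classification_report` covers the same ground in practice):

```python
# Accuracy, precision and recall for a binary classifier, from scratch.
def evaluate(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted 1s, how many right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of true 1s, how many found
    return {"accuracy": accuracy, "precision": precision, "recall": recall}

m = evaluate([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 1, 0])
print(m)  # accuracy 4/6, precision 2/3, recall 2/3
```

Error analysis then starts from the disagreement cases (the fp/fn rows), not the aggregate numbers.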
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We are looking for an experienced MuleSoft Developer to join our integration team and contribute to building scalable, secure, and robust integration solutions for the insurance industry. The ideal candidate should possess strong MuleSoft development expertise, a solid understanding of insurance domain processes, and effective communication and collaboration skills. Key Responsibilities: Design, develop, and implement integration flows using MuleSoft Anypoint Platform, including APIs, connectors, and transformations. Collaborate with business analysts, architects, and other stakeholders to understand insurance-specific requirements and translate them into technical solutions. Develop and manage API-led connectivity and event-driven integration patterns within the insurance ecosystem. Perform unit testing, integration testing, and support UAT. Ensure code quality through code reviews, version control, and CI/CD best practices. Maintain and enhance existing MuleSoft applications to meet evolving business needs. Participate in sprint planning, daily stand-ups, and retrospectives. Create documentation for technical designs, API specifications, and deployment processes. Technical Skills Required: 3+ years of hands-on experience in MuleSoft development (Mule 4 preferred). Expertise in designing RAML-based APIs, DataWeave transformations, and Mule connectors. Strong understanding of integration patterns, error handling, security, and performance tuning in Mule. Experience working with insurance core systems (e.g., Policy Admin, Claims, Underwriting, CRM) is a plus. Familiarity with cloud platforms (e.g., Azure, AWS) and API gateways. Knowledge of CI/CD tools (Jenkins, GitLab), version control (Git), and automated testing frameworks. Experience with message queues, Kafka, JMS, and relational and NoSQL databases. Non-Technical Skills: Strong understanding of insurance business processes, terminology, and workflows.
Excellent verbal and written communication skills for working with cross-functional teams and clients. Analytical thinking with the ability to troubleshoot and resolve complex integration issues. Ability to work in a fast-paced, Agile environment. Strong documentation and stakeholder presentation skills.
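The transformation work described above maps source-system records into a canonical API payload. DataWeave is Mule's own language for this; as a hedged Python analogue of the same shape of mapping (all field names here are hypothetical, not from a real policy-admin system):

```python
# Sketch of the field mapping a DataWeave transformation performs in an
# API-led insurance flow: trim, normalize case, coerce types, map codes.
def to_canonical_policy(source):
    return {
        "policyNumber": source["POL_NO"].strip(),
        "holder": {
            "firstName": source["FNAME"].title(),
            "lastName": source["LNAME"].title(),
        },
        "annualPremium": round(float(source["PREMIUM"]), 2),
        "status": {"A": "ACTIVE", "L": "LAPSED"}.get(source["STATUS"], "UNKNOWN"),
    }

record = {"POL_NO": " P-1001 ", "FNAME": "ravi", "LNAME": "KUMAR",
          "PREMIUM": "12500.5", "STATUS": "A"}
print(to_canonical_policy(record))
```

The defensive pieces (defaulting unknown status codes, rounding money fields) are where error handling in a real Mule flow would raise or route to a dead-letter queue instead.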
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Bengaluru
Work from Office
Data Engineer - Bangalore Notified is hiring a Data Engineer in Bangalore. Job Summary: We are seeking a highly skilled and experienced Engineer with 6-8 years of expertise in Informatica application integration, data integration, ETL processes, and SQL. The ideal candidate will be an exceptional problem solver, possess outstanding leadership and communication skills, and have a solid understanding of real-time data synchronization. Experience with CRM, ITSM data, and knowledge of Tableau/PowerBI is a definite plus. Essential Duties: Develop and implement complex ETL processes utilizing Informatica Intelligent Cloud Services (IICS) to efficiently extract, transform, and load data across multiple sources and destinations. Demonstrated expertise in data warehousing concepts, techniques, and best practices to design and maintain scalable, high-performing data models and schemas. Collaborate with cross-functional teams to gather project requirements, develop ETL specifications, and ensure the successful execution of data integration strategies. Ensure real-time data synchronization to optimize system performance and enhance data-driven decision-making processes. Ability to work with various data warehouses is key (Snowflake, MSSQL, MySQL, BigQuery) Perform ongoing platform administration tasks, including user management, data security, performance tuning, and system upgrades, to ensure optimal performance and data reliability. Implement data quality and validation procedures, including data pipeline error monitoring and issue resolution, to maintain the integrity of critical data assets. Continuously explore and adapt to new technologies and methodologies in the data engineering space, fostering a culture of innovation and continuous improvement within the team. Troubleshoot and resolve integration issues and incidents within the Informatica platform while maintaining the highest level of data quality and integrity. 
Adopt a proactive approach in maintaining and optimizing existing ETL processes, identifying areas for improvement, and aligning them with the organization's strategic goals. Aid senior staff when conducting complex data analysis to resolve management inquiries. Stay up to date with industry trends and best practices to ensure technical expertise and drive innovation within the team. Design and participate in a comprehensive and logical test plan for systems using appropriate tools to ensure established standards are utilized. Escalate to supervisor any situation outside the employee's control that could adversely impact the services being provided. Participate in code reviews and contribute to establishing best practices in data engineering, along with maintaining documentation, ensuring accuracy and consistency across the organization. Minimum Qualifications: Mandatory 6-8 years of experience in Informatica application integration (IICS), data integration, ETL jobs, Snowflake, and SQL, preferably in large and complex data environments. Strong knowledge of Informatica PowerCenter, IDQ, and IICS. 3-4 years' experience in SQL query writing, building data warehousing models, data marts, stored procedures & views for front-end reporting. Minimum 2 years of experience with data analysis, including experience in the analysis or design of applications or systems to store and extract data. Bachelor's degree in Computer Science, Engineering, or related field (master's degree preferred). Good to have: Basic knowledge of Snowflake DB. Proven experience in resolving data integration issues, ETL performance tuning, troubleshooting, and optimization. Knowledge of Salesforce, ServiceNow or ConnectWise data, and real-time data synchronization is good to have. Experience in working with Tableau/PowerBI and Snowflake, BigQuery tools is a plus. Strong leadership, communication, and problem-solving skills.
Ability to work independently and as part of a team, managing multiple projects simultaneously. This role will be based out of The Leela Office located on the 4th Floor, Airport Road, Kodihalli, Bangalore- 560008. Our expectation at this time is that you would work HYBRID - work from our office on Tuesdays, Wednesdays, Thursdays with flexibility to work from home on Mondays and Fridays. Bangalore candidates preferred. The working hours are 2 PM to 11 PM IST. ABOUT NOTIFIED Our products are built so storytellers can do their best work. But we're not just a platform: personalized, caring service is how we operate. We add a personal touch to everything we do. We strive to deliver wisdom and insight by helping our clients reach global and targeted audiences, measure outcomes, and fulfill their commitments. CULTURE AND BENEFITS At Notified, we aim to help our employees and their families maintain a healthy work/life balance and build a financially secure future. Self-development and learning are key, with all our global employees having access to our internal learning and development university, DevelopU, for career and skills enhancement. EXAMPLE OFFERINGS: International work environment - we have offices in 17 countries. Opportunities for innovation and creativity. Hybrid work schedule (office/home). Comprehensive health insurance with localized options. Extensive learning opportunities via our in-house virtual university with >8,000 online courses, videos, business books and certification preps. Location-specific social outings and company events with amazing colleagues, such as laser tag, board game night, and company-wide trivia night. At Notified we don't just accept difference - we celebrate it, support it, and build success upon it. We are proud to be an equal opportunities employer and no part of this advertisement is intended to discriminate on any grounds.
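The data-quality and validation duty in this posting (error monitoring, issue resolution before load) is rule checks over extracted rows. A minimal sketch, with hypothetical column names and rules rather than a specific Informatica mapping:

```python
# Rule checks applied to extracted rows before loading: required field,
# uniqueness, and a crude format check. Returns (row_index, reason) pairs.
def validate_rows(rows):
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("id") is None:
            errors.append((i, "missing id"))
        elif row["id"] in seen_ids:
            errors.append((i, "duplicate id"))
        else:
            seen_ids.add(row["id"])
        if row.get("email", "").count("@") != 1:
            errors.append((i, "bad email"))
    return errors

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": "b@x.com"},   # duplicate id
    {"id": None, "email": "cx.com"}, # missing id, malformed email
]
print(validate_rows(rows))  # -> [(1, 'duplicate id'), (2, 'missing id'), (2, 'bad email')]
```

In an IICS pipeline the same checks would typically route failing rows to a reject target and raise an alert, rather than silently dropping them.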
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Summary: We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making. Experience: 7 - 12 years Work Location: Hyderabad (Hybrid) / Remote Mandatory skills: AWS, Python, SQL, Airflow, DBT Must have done 1 or 2 projects in Clinical Domain/Clinical Industry. Responsibilities: Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads. Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage. Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks. Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases. Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions. Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering. Required Skills: Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7 - 12+ years of experience in data engineering Cloud Platforms: Strong expertise in AWS data services.
Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake. Programming: Proficiency in Python, Scala, or Java for data processing and automation. ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica. Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications
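Airflow, named in the mandatory skills above, models a pipeline as a DAG of tasks and runs them in dependency order. A dependency-resolution sketch using the standard library's `graphlib` (Python 3.9+); the task names are hypothetical, and a real Airflow DAG would declare the same edges with operators and `>>`:

```python
from graphlib import TopologicalSorter

# Each key maps a task to the set of tasks that must finish first --
# the same dependency structure an Airflow scheduler resolves.
deps = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # extracts first, then transform, then load_warehouse
```

The scheduler's added value over this sketch is running independent tasks (the two extracts) concurrently and retrying failures, but the ordering constraint is the same topological sort.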
Posted 1 week ago
5.0 - 7.0 years
3 - 7 Lacs
Ahmedabad
Work from Office
SAP ABAP | UI5 | FIORI | BTP Consultant Responsibilities: 5 to 7 years of hands-on experience in SAP HANA development, including native HANA objects (tables, views, stored procedures, etc.). The candidate is expected to play the role of an expert-level technical architect and leader covering the necessary architectural best practices and technology options in large SAP digital transformation programs leveraging SAPUI5, Fiori or RAP or CAPM with NodeJS/Java Springboot, OData, BTP, Build Process Automation, and BTP Workzone skillsets. Strong experience in performance tuning and optimization of SAP HANA database objects. Proficiency in SAP BusinessObjects Data Services (BODS) for data extraction, transformation, and loading (ETL). Strong knowledge of data modeling techniques and data integration using BODS. Experience with HANA tools such as HANA Studio and HANA Cockpit for development and administration tasks. Experience with HANA XS and SAP HANA Cloud is a plus. Familiarity with integrating HANA with other SAP systems like SAP S/4HANA, SAP BW, and third-party data sources. Extensive SAP experience including SAPUI5, Fiori or RAP or CAPM with NodeJS/Java Springboot, OData, BTP, Build Process Automation, BTP Workzone, and SAP ABAP on HANA. The candidate should have good experience across large SAP implementation, upgrade, development and migration projects across industries, and should be experienced in SAP S/4HANA offerings. The candidate should have experience with the latest offerings of SAP, such as BTP services like Business Application Studio, integration strategies, SAP Build Process Automation, and development experience in SAPUI5, Fiori, CAPM or RAP. Enforce SAP coding standards, performance optimization techniques, and security compliance. Implement ABAP Test Cockpit (ATC) and other SAP quality tools for code validation.
As a Technical Architect, lead the engagement efforts at different stages from problem definition to diagnosis to technical design, development and deployment in large S/4HANA transformation programs. Connect with senior client business and IT stakeholders, demonstrating thought leadership in new offerings of SAP technology like S/4HANA and BTP, and guiding them on end-to-end SAP architecture including SaaS and on-premise components. Excellent troubleshooting, debugging, and performance optimization skills. Ability to create technical documentation and collaborate with cross-functional teams. Strong understanding of SAP's security framework and data governance best practices. Experience working on the SAP Build portfolio, including SAP Build Process Automation, Build Apps and Build Workzone. Should possess experience in SAP BTP administration and BTP account setup. Technical Expertise & Responsibilities: Lead the entire development cycle. SAP S/4HANA Embedded Analytics. Qualify the customer needs. Knowledge of SAP CAP, RAP and BTP. Experience with SPAU and SPDD. Bachelor's degree or higher. Experience in Fiori, BTP and integration. Perfect written English. Highly creative and autonomous. Nice to have: Experience with Rise with SAP implementation. Strong analytical skills. What's great in the job: Great team of smart people, in a friendly and open culture. No dumb managers, no stupid tools to use, no rigid working hours. No waste of time in enterprise processes, real responsibilities and autonomy. Expand your knowledge of various business industries. Create content that will help our users on a daily basis. Real responsibilities and challenges in a fast evolving company. Discover our products. What We Offer: Each employee has a chance to see the impact of his work. You can make a real contribution to the success of the company. Several activities are often organized all over the year, such as weekly sports sessions, team building events, monthly drinks, and much more. A full-time position. Attractive salary package.
12 days / year, including 6 of your choice. Play any sport with colleagues, the bill is covered.
Posted 1 week ago
4.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Company Overview: D2KSS Software Services Private Limited is a leading software development company based in Hyderabad, India. We specialize in delivering innovative software solutions to clients across various sectors. Our mission is to leverage cutting-edge technology and agile methodologies to drive success for our clients. We are committed to fostering a culture of collaboration, creativity, and continuous learning, where our employees can thrive and contribute to impactful projects. Role Responsibilities: Design, develop, and maintain high-quality Angular applications. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code following best practices. Implement responsive web design to ensure user satisfaction across devices. Optimize applications for maximum speed and scalability. Conduct code reviews to ensure quality and share knowledge with team members. Debug and resolve issues to improve application performance. Stay updated with the latest trends and technologies in Angular development. Create and maintain documentation for developed applications. Participate in daily stand-ups and sprint planning meetings. Integrate with back-end services and APIs. Work on performance tuning, improvement, and enhancement of existing applications. Collaborate with UI/UX designers to improve the usability of applications. Assist in the training and mentoring of junior developers. Follow agile methodologies for development processes. Qualifications: 4 to 7 years of experience in Angular development. Strong knowledge of TypeScript and JavaScript. Proficiency in HTML5, CSS3, and responsive design. Experience with RESTful APIs and integration. Familiarity with unit testing frameworks (e.g., Jasmine, Mocha). Understanding of version control systems (e.g., Git). Ability to work independently and in a team environment. Excellent problem-solving skills. Strong communication and collaboration abilities. 
Experience with agile development practices. Bachelor's degree in Computer Science or a related field. Experience with build tools (e.g., Webpack, Gulp) is a plus. Familiarity with front-end testing tools (e.g., Protractor) is advantageous. Ability to adapt to new technologies quickly. Strong attention to detail and a passion for delivering exceptional user experiences.
Posted 1 week ago
10.0 - 15.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Title: SAP S/4HANA Technical Lead - ABAP & Fiori About GlobalFoundries Introduction: This is a solution delivery lead role for SAP S/4HANA - ABAP & Fiori based out of the GlobalFoundries office in Bangalore, India, and requires a mix of technical expertise, leadership abilities, team management skills and business acumen to lead development strategy and solution design in collaboration with implementation partners, GFIT and business users. Here are some key responsibilities and qualifications for this role: Responsibilities: Lead S/4HANA ABAP & Fiori development strategy and solution design to ensure quality, consistency, and efficiency. Conduct code reviews, provide technical guidance, and mentor team members. Ensure adherence to SAP best practices and coding standards. Collaborate with functional teams to understand business requirements and translate them into technical solutions - SAP ABAP RICEFW objects. Implement enhancements using BAdIs, user exits, and enhancement spots. Develop and support SAP Fiori applications, including custom apps and templates. Implement and optimize OData services and SAP Gateway configurations. Develop and maintain ALE/IDoc, RFCs, and BAPI-based integrations. Guide the team in optimization of existing ABAP programs for performance. Stay updated with the latest SAP technologies and trends. Other Responsibilities: Perform all activities in a safe and responsible manner and support all Environmental, Health, Safety & Security requirements and programs. Required Qualifications: Bachelor's degree in Computer Science, IT, or a related field. Proven experience in SAP ABAP development: minimum 10 years overall, with at least 5 years in a lead role, especially for S/4HANA. Strong knowledge of SAP modules and integration points. Proficiency in Object-Oriented ABAP, Enhancement Framework, BAPIs, BAdIs, Smart Forms, Workflows and debugging tools. Proficiency in creating and consuming OData services for Fiori applications.
Advanced debugging capabilities to identify and guide teams through issue resolution in both ABAP and Fiori applications. Strong analytical mindset and problem-solving skills to address technical and business challenges. Competence in resource allocation, timeline management, and task prioritization. Effective communication and leadership abilities. Hands-on experience in implementing best practices using S/4HANA ABAP, like code push-down techniques (CDS views, AMDP), S/4HANA ABAP performance tuning/optimization, and ATC. Experience working with IDEs like Eclipse, Business Application Studio, etc., and code migration/transport options through the landscape. Experience in BTP side-by-side extensions, in-app extensions and configuration will be preferred. Relevant SAP certifications will be preferred. GlobalFoundries is an equal opportunity employer, cultivating a diverse and inclusive workforce. We believe having a multicultural workplace enhances productivity, efficiency and innovation whilst our employees feel truly respected, valued and heard. As an affirmative employer, all qualified applicants are considered for employment regardless of age, ethnicity, marital status, citizenship, race, religion, political affiliation, gender, sexual orientation and medical and/or physical abilities. All offers of employment with GlobalFoundries are conditioned upon the successful completion of background checks, medical screenings as applicable and subject to the respective local laws and regulations. You can find information about our benefits here: https://gf.com/about-us/careers/opportunities-asia
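The code push-down practice named above (CDS views, AMDP) means letting the database engine do set-based work instead of looping over rows in application code. A minimal sketch of the principle using SQLite as a stand-in database; in S/4HANA the same idea is expressed through CDS views or AMDP, not this SQL dialect, and the table and values here are invented:

```python
import sqlite3

# Aggregate inside the engine (one GROUP BY) rather than fetching every
# row into the application and summing in a loop -- the push-down idea.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("EMEA", 100.0), ("EMEA", 50.0), ("APAC", 70.0)])

rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # -> [('APAC', 70.0), ('EMEA', 150.0)]
```

On HANA the payoff is larger than this toy suggests: the column store aggregates in memory, and only the small result set crosses the application-server boundary.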
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Kochi
Work from Office
Junior Azure Cloud Engineer Implement scalable, secure, and highly available Azure infrastructure. Manage and optimize Azure services including VMs, Azure Functions, App Services, Azure SQL, Storage Accounts, and VNET. Monitor and maintain cloud environments using Azure Monitor, Log Analytics, and Application Insights. Migrate workloads: Azure to Azure, on-premises to Azure, and other clouds to Azure. Troubleshoot and resolve infrastructure and application-related issues in a cloud environment. Work closely with the sales and customer teams during requirement gathering and deployment phases. 2+ years of IT experience with at least 1+ years in Azure cloud. Understanding of Azure IaaS, PaaS, and networking services. Understanding of M365 and O365. Experience in migrating workloads. Experience in deploying infrastructure in Azure. Experience with monitoring, alerting, and performance tuning in Azure.
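The monitoring and alerting requirement above is, at its core, evaluating metric samples against a threshold over a window: the same shape as an Azure Monitor metric alert (threshold plus evaluation window). A minimal sketch with made-up CPU samples:

```python
# Fire only on a sustained breach: `consecutive` samples in a row above
# the threshold, so a single spike does not page anyone.
def breaches(samples, threshold, consecutive):
    run = 0
    for value in samples:
        run = run + 1 if value > threshold else 0
        if run >= consecutive:
            return True
    return False

cpu = [40, 85, 92, 95, 60, 91]
print(breaches(cpu, threshold=80, consecutive=3))  # -> True (85, 92, 95)
```

In Azure Monitor the equivalent knobs are the alert rule's threshold, aggregation granularity, and "number of violations" setting; this sketch just makes the logic visible.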
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Udupi
Work from Office
CodeZyng is looking for a PostgreSQL Database Engineer to join our dynamic team and embark on a rewarding career journey. PostgreSQL DBAs design and implement database schema, including tables, views, and stored procedures, to support software applications. Database installation and configuration: PostgreSQL DBAs install and configure PostgreSQL databases on servers, ensuring that they are optimized for performance and reliability. PostgreSQL DBAs optimize database performance by analyzing database queries and tuning database configuration settings. PostgreSQL DBAs implement appropriate security measures, such as encryption and access controls, to ensure that sensitive data is protected. PostgreSQL DBAs ensure that PostgreSQL databases are highly available and scalable to meet the needs of growing applications.
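Tuning configuration settings, mentioned above, usually starts from memory sizing. A sketch of two widely cited rules of thumb (shared_buffers around 25% of RAM, commonly capped; effective_cache_size around 75%); these are heuristics to tune from under load testing, not official defaults:

```python
# Starting-point values for two memory-related postgresql.conf settings,
# derived from total RAM via common community rules of thumb.
def suggest_memory_settings(ram_gb):
    shared_buffers_gb = min(ram_gb * 0.25, 8)   # cap often advised in practice
    effective_cache_size_gb = ram_gb * 0.75     # planner hint, not an allocation
    return {
        "shared_buffers": f"{shared_buffers_gb:g}GB",
        "effective_cache_size": f"{effective_cache_size_gb:g}GB",
    }

print(suggest_memory_settings(16))
# -> {'shared_buffers': '4GB', 'effective_cache_size': '12GB'}
```

Note the asymmetry: shared_buffers actually allocates memory, while effective_cache_size only tells the planner how much OS cache to assume, so overstating the latter is far cheaper than overstating the former.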
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio. Data Modeling and Analysis: Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes. Analyze and model data to ensure optimal ETL design and performance. Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components. Preferred technical and professional experience Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality. Documentation
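The Rollup and Join components named in the posting correspond to familiar aggregate-then-enrich steps. Ab Initio graphs themselves are built in its own environment, so this is only a language-neutral sketch of the two operations, with invented record layouts:

```python
from collections import defaultdict

orders = [
    {"cust_id": 1, "amount": 120.0},
    {"cust_id": 2, "amount": 80.0},
    {"cust_id": 1, "amount": 30.0},
]
customers = {1: "Asha", 2: "Ben"}

# Rollup: aggregate amounts per key (what a Rollup component does).
totals = defaultdict(float)
for o in orders:
    totals[o["cust_id"]] += o["amount"]

# Join: attach the customer name to each rolled-up record.
target = [{"customer": customers[k], "total": v}
          for k, v in sorted(totals.items())]
print(target)  # -> [{'customer': 'Asha', 'total': 150.0}, {'customer': 'Ben', 'total': 80.0}]
```

In a graph, the key decisions are the same as here: what the rollup key is, and whether the join is inner or outer when a key is missing on one side.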
Posted 1 week ago
5.0 - 10.0 years
11 - 16 Lacs
Kolkata
Work from Office
Develop partnerships with key stakeholders in HR to understand the strategic direction, business process, and business needs. Should be well versed with AGILE / Scrum / DevOps. Create technical solutions to meet business requirements. Help Finance business users adopt best practices. Excellent verbal & written communication skills. Define user information requirements in Oracle E-Business Suite. Implement plans to test business and functional processes. Manage test scripts that support Oracle R12 financial applications. Lead technical acceptance testing (Unit, SIT, and QAT) of patches and upgrades. Deliver training content to users. Candidate must be ready to work from office daily and in shifts if required. NO Work From Home allowed. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Minimum of 5+ years of relevant experience in Oracle Cloud Applications. Excellent (2-3 projects) implementation experience in cloud/fusion and 2/3 implementations in 12.2.9. Overall 8+ years of relevant experience in Oracle R12.2.9 and Fusion release 13. Reasonable exposure to P2P modules like iProc/PO/AP/FA/eBiz Tax/India localization/GL. Oracle R12.2.9 and Cloud Applications experience is a must. Oracle Projects exposure is an added advantage. Subledger Accounting (SLA) knowledge. Preferred technical and professional experience Oracle PL/SQL. DBA / Technical Skills. Performance Tuning
Posted 1 week ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
As Consultant, you are responsible to develop design of application, provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvements by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Total experience of 3-5 years. Mandatory Skills: LUW/Unix DB2 DBA, Shell Scripting, Linux and AIX knowledge. Secondary Skills: Physical DB2 DBA skills, PL/SQL knowledge and exposure. Expert knowledge of IBM DB2 V11.5 installations, configurations and administration in Linux/AIX systems. Expert-level knowledge of database restores, including redirected restore and backup concepts. Preferred technical and professional experience Good knowledge of utilities like import, load, export under high-volume conditions. Ability to tune SQLs using db2advisor and db2explain. Ability to troubleshoot database issues using db2diag, db2pd, db2dart, db2top etc.
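The shell-scripting side of DB2 administration listed above often means scanning diagnostic output for severe entries. A sketch over log lines that imitate the LEVEL field of a db2diag.log entry; the lines are invented, and real filtering would more likely use `db2diag -level severe,error` directly:

```python
import re

# Pull out Severe/Error lines from diagnostic-style output.
log = """\
2024-06-01 LEVEL: Info    MESSAGE: backup started
2024-06-01 LEVEL: Severe  MESSAGE: SQL0968C disk full
2024-06-02 LEVEL: Warning MESSAGE: lock escalation
2024-06-02 LEVEL: Error   MESSAGE: SQL1034C database damaged
"""

severe = [line for line in log.splitlines()
          if re.search(r"LEVEL:\s+(Severe|Error)", line)]
print(len(severe))  # -> 2
```

Wired into cron with a mail or ticket hook, this is the shape of a basic proactive health check for the environments described.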
Posted 1 week ago