6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
SystemsPlus is hiring a Backend Developer (6+ years of experience; immediate joiners only). We are seeking a highly skilled Backend Developer with expertise in AWS cloud technologies and experience in end-to-end solution implementation. The ideal candidate will work on initiatives, provide scalable and effective solutions, and collaborate with cross-functional teams to ensure successful delivery. This role demands hands-on backend development expertise, delivering features, and strategic planning with the tech lead.
Key Responsibilities:
• Lead and manage the end-to-end implementation of software solutions on AWS.
• Good understanding of on-premises solutions and how to migrate them to cloud-based services.
• Develop and architect cloud-based applications, microservices-based APIs, relational databases (Postgres RDS), and graph database technologies, ensuring scalability, performance, and accuracy.
• Collaborate with the tech lead to gather requirements, translate them into an implementation plan, and provide solutions.
• Perform unit and functional testing and integrate added or modified features into the existing flow.
• Ensure the development cycle follows the defined SDLC standards and that deployments go through automated CI/CD code release pipelines.
• Identify and implement orchestration/automation tools to streamline processes using Airflow.
• Ensure solutions comply with security standards and performance benchmarks.
• Troubleshoot and resolve functional issues across environments.
• Stay updated with AWS services and industry trends, driving continuous improvement.
Required Skills and Experience:
• 6+ years of experience in software development and backend development roles where business logic implementation is the driver.
• Expertise in Python for building services and business logic that interact with data (databases or a data lake).
• Expertise in AWS services: EC2, S3, Lambda, RDS, Kinesis, IAM, API Gateway, etc.
• Expertise in Oracle or Postgres: building functionality that makes heavy use of those databases.
• Proven experience with end-to-end project delivery and system integrations.
• Strong understanding of microservices development and serverless computing.
• Proficiency in programming languages: PL/pgSQL, Java, bash scripting.
• Expertise in Elasticsearch or similar search engines.
• Experience with CI/CD pipelines and tools like GitLab, GitHub, or AWS CodePipeline.
• Experience with IaC (Terraform).
• Solid understanding of Docker and cloud architecture best practices.
• Strong problem-solving skills and a team-player mindset.
Preferred Qualifications:
• AWS certifications (e.g., AWS Certified Solutions Architect).
• Experience with Agile/Scrum methodologies.
• Familiarity with networking, security, Kubernetes, and containerization tools.
• Prior experience leading multi-disciplinary teams and managing onshore/offshore delivery.
• Prior experience working on ETL pipelines.
Soft Skills:
• Excellent communication and stakeholder management skills.
• Ability to collaborate and interact with team members effectively.
• Strong analytical thinking and decision-making abilities.
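The posting above leans on Python, AWS services (S3, Lambda, RDS) and Airflow orchestration. As a purely illustrative sketch of that stack (not part of the listing; it assumes Airflow 2.4+ and boto3, and the bucket, prefix, DAG id and task names are hypothetical placeholders), a minimal daily DAG might look like this:

```python
"""Illustrative only: a minimal Airflow 2.x DAG of the kind of orchestration
this role describes (check an S3 landing prefix, then load into Postgres RDS).
All names below are hypothetical placeholders."""
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def check_s3_landing(**_):
    """Return the object keys found under the (hypothetical) daily prefix."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket="example-landing-bucket", Prefix="daily/")
    return [obj["Key"] for obj in resp.get("Contents", [])]


def load_into_rds(ti, **_):
    """Placeholder load step; a real task would upsert into Postgres RDS."""
    keys = ti.xcom_pull(task_ids="check_s3_landing")
    print(f"Would load {len(keys or [])} objects into Postgres RDS")


with DAG(
    dag_id="example_daily_load",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # 'schedule' argument requires Airflow 2.4+
    catchup=False,
) as dag:
    check = PythonOperator(task_id="check_s3_landing", python_callable=check_s3_landing)
    load = PythonOperator(task_id="load_into_rds", python_callable=load_into_rds)
    check >> load
```

In a real pipeline the load step would write to Postgres RDS through a managed connection rather than just logging, and the S3 check would typically be event-driven or sensor-based.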
Posted 1 month ago
4.0 years
0 Lacs
India
On-site
Job Summary: We are seeking a highly skilled Java Migration Engineer to support the migration of enterprise Java applications from IBM WebSphere to an open-source Java stack (e.g., Spring Boot, Tomcat) deployed on Kubernetes. The ideal candidate should have a strong background in Java EE, microservices architecture, and hands-on experience with containerization and cloud-native deployments. Key Responsibilities: Analyze and assess existing WebSphere-based Java applications for migration readiness. Refactor and migrate Java EE applications to Spring Boot or similar open-source frameworks. Replace WebSphere-specific services (e.g., JMS, JTA, JNDI) with open-source equivalents. Containerize applications using Docker and deploy/manage them on Kubernetes clusters. Develop Helm charts or Kubernetes YAML manifests for deployment automation. Implement CI/CD pipelines for seamless builds, testing, and deployments (e.g., Jenkins, GitLab CI). Perform performance tuning, logging, and monitoring in a cloud-native environment. Collaborate with DevOps, architecture, and QA teams to ensure smooth migration and integration. Document migration steps, architectural changes, and configurations. Required Skills and Qualifications: 4+ years of hands-on experience with Java/J2EE development. Solid understanding of IBM WebSphere application server internals and configurations. Proven experience migrating apps to Spring Boot, Tomcat, or Jetty. Strong experience with Docker, Kubernetes, and cloud-native deployments. Familiarity with service mesh (e.g., Istio), API gateways, and distributed systems. Good grasp of microservices architecture and 12-factor app principles. Experience with Git, build tools (Maven/Gradle), and CI/CD tools (Jenkins, ArgoCD, etc.). Excellent debugging, problem-solving, and performance tuning skills. Preferred Qualifications: Experience with cloud platforms (AWS, Azure, or GCP). Exposure to legacy monolith decomposition and modernization projects. Knowledge of logging and monitoring tools (Prometheus, Grafana, ELK). Experience working with JMS replacement solutions (e.g., RabbitMQ, Kafka). Familiarity with configuration tools like Spring Cloud Config or HashiCorp Vault. Certifications in Kubernetes (CKA/CKAD) or cloud platforms are a plus.
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: Highly skilled Frontend (UI) Developer specialized in web applications, with a strong background in React JS for Gen AI projects in Azure (preferred) and AWS environments. Strong proficiency in React JS and its core principles. Familiarity with RESTful APIs and modern front-end build pipelines and tools. Familiar with conversational AI and LLM models.
Required Experience: 8+ years of overall experience, including at least 4+ years of hands-on work experience in React JS. Minimum 5+ years of HTML, CSS, and JavaScript experience. Minimum 5+ years of React JS experience. Experience in DevOps, with hands-on experience on one or more cloud service providers: Azure (preferred), AWS, GCP. BE/B.Tech in Computer Science, Maths, or related technical fields. Stakeholder engagement with BUs and vendors.
Skills, Abilities, Knowledge: Strong proficiency in React JS and its core principles. Experience with popular React workflows (such as Redux or Context API). Proficiency in HTML, CSS, and JavaScript. Familiarity with RESTful APIs and modern frontend build pipelines and tools. Develop and maintain user interfaces for web applications using React JS. Collaborate with designers and backend developers to create seamless and responsive user experiences. Optimize applications for maximum speed and scalability. Write clean, maintainable, and efficient code. Implement and adhere to best practices in frontend development. Troubleshoot and debug issues as they arise. Experience with TypeScript. Knowledge of modern authorization mechanisms, such as JSON Web Token (JWT). Familiarity with frontend testing frameworks (such as Jest or Mocha). Understanding of server-side rendering and its benefits. Experience with version control systems like GitHub and CI/CD tools. Strong problem-solving skills and attention to detail. Excellent communication and teamwork skills.
About the Company: Everest DX – We are a Digital Platform Services company, headquartered in Stamford. Our platform/solution includes orchestration, intelligent operations with BOTs, and AI-powered analytics for Enterprise IT. Our vision is to enable digital transformation for enterprises to deliver seamless customer experience, business efficiency and actionable insights through an integrated set of futuristic digital technologies.
Digital Transformation Services - Specialized in Design, Build, Develop, Integrate, and Manage cloud solutions, modernize data centers, build cloud-native applications, and migrate existing applications into secure, multi-cloud environments to support digital transformation. Our Digital Platform Services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation.
Digital Platform - Cloud Intelligent Management (CiM) - An autonomous hybrid cloud management platform that works across multi-cloud environments. It helps enterprises get the most out of their cloud strategy in their digital transformation while reducing cost and risk and increasing speed. To know more, please visit: http://www.everestdx.com
Posted 1 month ago
5.0 years
0 Lacs
Bardez, Goa, India
Remote
Azure Cloud Architect - Azure VMware | Porvorim, Goa, India | MITS | Full-time | Fully remote
Welcome to Frontline Managed Services® – where innovation, technology, and efficiency converge to redefine the landscape of IT, Financial, and Administrative Managed Services for legal and professional service firms. As pioneers in the industry, we are driven by a relentless commitment to excellence. Join Our Team and Be a Catalyst for Change! We don't just follow industry standards; we set them. Our dynamic environment thrives on pushing boundaries and embracing challenges. We are more than a workplace; we are a community of forward-thinkers dedicated to shaping the future. Position Overview The Azure Cloud Engineer will lead the design, deployment, and support of Azure VMware Solution (AVS) environments for law firms and other professional services clients. You’ll architect and manage scalable hybrid cloud infrastructures, enabling seamless integration between on-premises VMware workloads and Azure-native services. Expect deep hands-on work in AVS components such as vSphere, vSAN, and NSX-T, along with Azure networking, identity, and storage. You’ll collaborate closely with client IT teams, often in high-uptime or regulated environments, to deliver secure, performant solutions that align with business needs. Work Location – Remote Desired work hours – US Business Hours (Approx. 7:00pm – 3:30am) Salary Budget – 30-40LPA What You’ll Be Responsible For Design and deploy Azure VMware Solution (AVS) environments for hybrid cloud infrastructure Migrate on-premises VMware workloads to AVS with minimal downtime and strong rollback plans Configure and manage vSphere, vSAN, and NSX-T within the AVS platform Integrate AVS with Azure services including networking, identity, storage, and monitoring Implement connectivity solutions such as ExpressRoute and site-to-site VPNs Support identity federation and access control using Azure AD and on-prem Active Directory Establish backup, disaster recovery, and high availability strategies within AVS Document architecture, client-specific configurations, and operational runbooks Qualifications 5+ years in infrastructure, cloud, or systems engineering roles, with 2+ years focused on VMware in Azure or hybrid cloud environments Deep hands-on experience with Azure VMware Solution (AVS), including deployment, management, and optimization Strong expertise in vSphere, vSAN, and NSX-T administration Solid understanding of Azure networking, including VNet peering, ExpressRoute, and Network Security Groups Experience integrating AVS with Azure AD, storage (Blob, Azure Files), and monitoring tools (e.g., Azure Monitor, Log Analytics) Familiarity with scripting and automation tools (e.g., PowerShell, ARM templates, Terraform, or Bicep) Proven ability to interface with clients, manage cross-functional expectations, and deliver in high-uptime environments Nice to have PowerShell scripting skills for session host automation and diagnostics Experience with Nerdio Manager, Azure Lighthouse, or Windows 365 Certifications: Microsoft Certified: Azure Virtual Desktop Specialty Microsoft Certified: Azure Solutions Architect Expert Microsoft Certified: Azure Administrator Associate It's not expected that any single candidate would have expertise across all these areas.
If you believe you meet a majority and are excited to learn what you do not already know, Please Apply! “We are an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.”
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.
About The Role
This is a remote, contract position responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization and Advanced Analytics). We are looking for a skilled Senior Data Engineer with a strong background in Python, SQL, PySpark, Azure, Databricks, Synapse, Azure Data Lake, DevOps and cloud-based large-scale data applications, with a passion for data quality, performance and cost optimization. The ideal candidate will develop in an Agile environment, contributing to the architecture, design, and implementation of Data products in the Aviation Industry, including migration from Synapse to Azure Data Lake. This role involves hands-on coding, mentoring junior staff and collaboration with multi-disciplined teams to achieve project objectives.
Qualification & Experience
Must have a full-time Bachelor's degree in Computer Science or similar. At least 5 years of experience as a data engineer with strong expertise in Databricks, Azure, DevOps, or other hyperscalers. 5+ years of experience with Azure DevOps, GitHub. Proven experience delivering large-scale projects and products for Data and Analytics, as a data engineer, including migrations. The following certifications: Databricks Certified Associate Developer for Apache Spark; Databricks Certified Data Engineer Associate; Microsoft Certified: Azure Fundamentals; Microsoft Certified: Azure Data Engineer Associate; Microsoft Exam: Designing and Implementing Microsoft DevOps Solutions (nice to have).
Required Skills/Competencies
Strong programming skills in one or more languages such as Python (must have) and Scala, and proficiency in writing efficient and optimized code for data integration, migration, storage, processing and manipulation. Strong understanding of and experience with SQL, including writing advanced SQL queries. Thorough understanding of big data principles, techniques, and best practices. Strong experience with scalable and distributed data processing technologies such as Spark/PySpark (must have: experience with Azure Databricks), DBT and Kafka, to be able to handle large volumes of data. Solid Databricks development experience with significant Python, PySpark, Spark SQL, Pandas and NumPy work in an Azure environment. Strong experience in designing and implementing efficient ELT/ETL processes in Azure and Databricks, using open-source solutions and being able to develop custom integration solutions as needed. Skilled in data integration from different sources such as APIs, databases, flat files and event streaming. Expertise in data cleansing, transformation, and validation. Proficiency with relational databases (Oracle, SQL Server, MySQL, Postgres, or similar) and NoSQL databases (MongoDB or Table). Good understanding of Data Modeling and Database Design Principles.
Being able to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions. Strong experience in designing and implementing Data Warehousing, data lake and data lakehouse solutions in Azure and Databricks. Good experience with Delta Lake, Unity Catalog, Delta Sharing and Delta Live Tables (DLT). Strong understanding of the software development lifecycle (SDLC), especially Agile methodologies. Strong knowledge of SDLC tools and technologies in Azure DevOps and GitHub, including project management software (Jira, Azure Boards or similar), source code management (GitHub, Azure Repos or similar), CI/CD systems (GitHub Actions, Azure Pipelines, Jenkins or similar) and binary repository managers (Azure Artifacts or similar). Strong understanding of DevOps principles, including continuous integration, continuous delivery (CI/CD), infrastructure as code (IaC: Terraform, ARM, including hands-on experience), configuration management, automated testing, performance tuning, and cost management and optimization. Strong knowledge of cloud computing, specifically Microsoft Azure services related to data and analytics, such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake, Azure Stream Analytics, SQL Server, Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database, etc. Experience in orchestration using technologies like Databricks Workflows and Apache Airflow. Strong knowledge of data structures and algorithms and good software engineering practices. Proven experience migrating from Azure Synapse to Azure Data Lake, or other technologies. Strong analytical skills to identify and address technical issues, performance bottlenecks, and system failures. Proficiency in debugging and troubleshooting issues in complex data and analytics environments and pipelines. Good understanding of Data Quality and Governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent. Experience with BI solutions including Power BI is a plus. Strong written and verbal communication skills to collaborate and articulate complex situations concisely with cross-functional teams, including business users, data architects, DevOps engineers, data analysts, data scientists, developers, and operations teams. Ability to document processes, procedures, and deployment configurations. Understanding of security practices, including network security groups, Azure Active Directory, encryption, and compliance standards. Ability to implement security controls and best practices within data and analytics solutions, including proficient knowledge and working experience of various cloud security vulnerabilities and ways to mitigate them. Self-motivated, with the ability to work well in a team, and experienced in mentoring and coaching different members of the team. A willingness to stay updated with the latest services, Data Engineering trends, and best practices in the field. Comfortable with picking up new technologies independently and working in a rapidly changing environment with ambiguous requirements. Care about architecture, observability, testing, and building reliable infrastructure and data pipelines.
Responsibilities
Architect, design, develop, test and maintain high-performance, large-scale, complex data architectures which support data integration (batch and real-time, ETL and ELT patterns from heterogeneous data systems: APIs and platforms), storage (data lakes, warehouses, data lakehouses, etc.), processing, orchestration and infrastructure, ensuring the scalability, reliability, and performance of data systems, with a focus on Databricks and Azure. Contribute to detailed design, architectural discussions, and customer requirements sessions. Actively participate in the design, development, and testing of big data products. Construct and fine-tune Apache Spark jobs and clusters within the Databricks platform. Migrate out of Azure Synapse to Azure Data Lake or other technologies. Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive). Design and implement data models and schemas that support efficient data processing and analytics. Design and develop clear, maintainable code with automated testing using Pytest, unittest, integration tests, performance tests, regression tests, etc. Collaborate with cross-functional teams including Product, Engineering, Data Scientists and Analysts to understand data requirements and develop data solutions, including reusable components meeting product deliverables. Evaluate and implement new technologies and tools to improve data integration, data processing, storage and analysis. Evaluate, design, implement and maintain data governance solutions: cataloging, lineage, data quality and data governance frameworks that are suitable for a modern analytics solution, considering industry-standard best practices and patterns. Continuously monitor and fine-tune workloads and clusters to achieve optimal performance. Provide guidance and mentorship to junior team members, sharing knowledge and best practices. Maintain clear and comprehensive documentation of the solutions, configurations, and best practices implemented. Promote and enforce best practices in data engineering, data governance, and data quality. Ensure data quality and accuracy. Design, implement and maintain data security and privacy measures. Be an active member of an Agile team, participating in all ceremonies and continuous improvement activities, and be able to work independently as well as collaboratively.
Fusemachines is an Equal Opportunities Employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
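As a rough illustration of the Databricks/PySpark work this role describes (not taken from the posting; it assumes a Databricks or Spark environment with Delta Lake available, and the ADLS paths and the `event_id` column are hypothetical), a minimal batch job that lands raw JSON as a partitioned Delta table could look like this:

```python
# Illustrative sketch only: read raw JSON from ADLS, apply a basic data-quality
# step, and write a partitioned Delta table. Assumes Delta Lake is installed
# (e.g. a Databricks runtime); paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@exampleaccount.dfs.core.windows.net/events/"
delta_path = "abfss://curated@exampleaccount.dfs.core.windows.net/events_delta/"

events = (
    spark.read.json(raw_path)                      # batch read of raw JSON files
    .withColumn("ingest_date", F.current_date())   # simple audit/partition column
    .dropDuplicates(["event_id"])                  # hypothetical business key
)

(
    events.write.format("delta")
    .mode("overwrite")
    .partitionBy("ingest_date")
    .save(delta_path)
)
```

In practice such a job would usually be parameterized, scheduled through Databricks Workflows or Airflow, and covered by the Pytest-style tests the posting mentions.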
Posted 1 month ago
7.0 years
0 Lacs
Andhra Pradesh, India
On-site
7+ years of experience as a developer. Able to work independently and collaborate with the existing team. Create new ETL processes using Python and bring existing processes into the ETL flow. Analyse existing DataStage processes and independently migrate them to Python-based Snowflake ETL. Mandatory skills: Python, Snowflake, SQL.
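For orientation only, a DataStage job migrated to Python-based Snowflake ETL often reduces to the pattern below: a stage load followed by a SQL transformation driven from Python. This is a hedged sketch, not the team's actual code; it assumes the snowflake-connector-python package, and the warehouse, stage and table names are placeholders.

```python
# Illustrative sketch only: the shape of a Python-driven Snowflake ETL step.
# Connection details, stage, and table names are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load a staged file, then apply a simple SQL transformation -- the pattern
    # many DataStage extract/transform stages reduce to in Snowflake.
    cur.execute(
        "COPY INTO STAGING.ORDERS_RAW FROM @ETL_STAGE/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        "INSERT INTO ANALYTICS.CURATED.ORDERS "
        "SELECT ORDER_ID, CUSTOMER_ID, ORDER_TS::DATE, AMOUNT "
        "FROM STAGING.ORDERS_RAW WHERE AMOUNT IS NOT NULL"
    )
finally:
    conn.close()
```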
Posted 1 month ago
7.0 years
0 Lacs
India
Remote
For an international project in Chennai, we are urgently looking for a fully remote Senior SQL/Postgres DB Developer. This position is for a highly motivated, driven candidate with a passion for data analysis and database development, and experience focused on complex data transformation and migration. We are looking for a motivated contractor. Candidates need to be fluent in English.
Tasks and responsibilities: Data migration strategy; development of logical and physical DB models; development of DB tables, stored procedures, scripts, data migration scripts and reconciliation reports as required; unit testing; defect fixes in QA and UAT; deployment in production.
Profile: Bachelor's or Master's degree; 7+ years of relevant experience working with databases like SQL and Postgres, with specialized experience in data migration for highly complex applications; experience in strategizing and implementing large data migrations using scripts and data migration/transformation tools; design solutions to migrate data across different databases; excellent speaking, listening, and writing skills, with attention to detail; proactive self-starter, able to deliver on tight timelines and adjust to fast-paced, changing priorities; implement and maintain PostgreSQL database code in the form of stored procedures, scripts, queries, views, triggers, etc.; implement effective and maintainable database coding practices that form an architectural foundation; analyze existing SQL queries and PL/pgSQL code for performance improvements; develop PostgreSQL functions, PL/pgSQL statements and stored procedures to maintain business logic; hands-on experience with Azure Database for PostgreSQL and Azure Key Vault for secure credential management; familiarity with Azure Web API, Azure Functions, and Azure AD authentication for secure data access; design relational logical data models and their physical schema design; investigate, troubleshoot, and correct data and user-related system errors; fluent in English.
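Reconciliation reports like those mentioned above are often just scripted comparisons between source and target. The sketch below is illustrative only (it assumes the psycopg2 driver, and the connection strings and table list are hypothetical); a real reconciliation would typically also compare checksums or key sets, not just row counts.

```python
# Illustrative sketch only: a tiny post-migration reconciliation report that
# compares row counts per table between a source and a target PostgreSQL
# database. Connection strings and the table list are placeholders.
import psycopg2

TABLES = ["customers", "accounts", "transactions"]  # hypothetical migrated tables


def row_counts(dsn):
    """Return {table: row_count} for the database identified by dsn."""
    counts = {}
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for table in TABLES:
            cur.execute(f"SELECT count(*) FROM {table}")
            counts[table] = cur.fetchone()[0]
    return counts


if __name__ == "__main__":
    source = row_counts("dbname=legacy user=readonly host=source-db")
    target = row_counts("dbname=migrated user=readonly host=target-db")
    for table in TABLES:
        status = "OK" if source[table] == target[table] else "MISMATCH"
        print(f"{table:15s} source={source[table]:>10} target={target[table]:>10} {status}")
```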
Posted 1 month ago
0 years
0 Lacs
Andaman and Nicobar Islands, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyse business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager, and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
0 years
0 Lacs
Andaman and Nicobar Islands, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
0 years
0 Lacs
Delhi, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyse business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager, and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
0 years
0 Lacs
Delhi, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyse business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager, and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!
Summary
Job Description: Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the IT Engineering Manager and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
Maintain the product classification system within the PIM. Maintain the product marketing taxonomy within the PIM, with the Global Website Portfolio team. Acquire product feature information for configurable products, develop selection option-based rules for material variant feature generation for fixed-bill-of-material products, and acquire catalogue number-based features. Periodically create and load product features for new configurable product variants. Prepare new products for the PIM by establishing linkages to taxonomy, classification system, images, documentation, and drawings. Publish new products to the online catalogue. Monitor PIM data quality and completeness. Maintain PIM data. Build the PIM translation process. Build PIM enrichment/improvement projects. Monitor and support data integrations.
The Essentials - You Will Have
Bachelor's Degree in computer science, management information systems, engineering, or related field. Experience with data setup. Experience working with external data sources: establishing processes to load data and provide 24x7 site maintenance, e.g. tax, product availability, pricing. Migration: develop tools to migrate transactional data from old to new systems. Experience in export/reporting: established processes to extract/transfer data to other systems and data layers, e.g. ROKFusion (w.r.t. Rockwell) and other similar systems and tools.
The Preferred - You Might Also Have
Working knowledge of a broad range of industrial automation products. Familiarity with ERP material master data concepts, including configuration. Ability to maintain data in the context of PIM systems and MDM systems. Ability to adapt to new technologies and changing requirements. Ability to work with multiple partners and influence project decisions.
Temperament
Ability to adapt to and assist colleagues through change and support change management processes. Adapt to competing demands.
IPC - Information Processing Capability (Factors of Complexity)
Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors. Exercise judgement within defined practices to determine appropriate action. Apply process improvements to facilitate improved outcomes. Implement and execute processes across business/function to achieve assigned goals. Strong analytical skills: the ability to distil information from different data sources, tell the “story” behind it, and recommend next steps. Accepts role requirements.
What We Offer
Our benefits package includes: comprehensive mindfulness programs with a premium membership to Calm; Volunteer Paid Time Off available after 6 months of employment for eligible employees; a company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation; an Employee Assistance Program; personalized wellbeing programs through our OnTrack program; an on-demand digital course library for professional development; and other local benefits!
At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Java Developer. Location: Hyderabad. Experience: 5+ years. Employment Type: Full-Time. Notice Period: immediate to 15 days; such joiners are highly preferred.
Job Summary: We are looking for a highly skilled Senior Java Developer with 5+ years of experience to join our engineering team. The ideal candidate will have hands-on expertise in Java 8/17, Spring Boot, Spring Cloud, Kafka, and Camunda, and be adept at building scalable microservices and integrating cloud-native technologies.
Key Responsibilities: Design, develop, and maintain scalable microservices using Java 8/17, Spring Boot, and Spring Cloud. Implement secure, robust applications with Spring Security. Develop and consume RESTful web services and integrate services across distributed systems. Work with Apache Kafka for real-time data streaming and messaging use cases. Build process automation workflows using Camunda BPM. Write efficient and testable code using JUnit and adhere to best coding practices. Design database schemas and write queries for both SQL and NoSQL (CosmosDB) databases. Troubleshoot issues, optimize performance, and improve the reliability of backend services. Collaborate with cross-functional teams including architects, DevOps, QA, and product teams to deliver high-quality software solutions.
Required Skills & Experience: 5+ years of hands-on experience in Java 8 and/or Java 17 development. Strong expertise in Spring Boot, Spring Cloud, and Spring Security. Solid experience with Apache Kafka and Camunda BPM. Proficient in microservices architecture and integration patterns. Hands-on experience with RESTful APIs, JUnit, and testing frameworks. Good understanding of SQL databases and NoSQL databases like CosmosDB. Strong problem-solving skills and ability to work in Agile environments.
Preferred Qualifications: Experience with cloud platforms such as Azure, AWS, or GCP. Familiarity with CI/CD pipelines and containerization (e.g., Docker, Kubernetes). Knowledge of logging and monitoring tools.
About the Company: Everest DX – We are a Digital Platform Services company, headquartered in Stamford. Our platform/solution includes orchestration, intelligent operations with BOTs, and AI-powered analytics for Enterprise IT. Our vision is to enable digital transformation for enterprises to deliver seamless customer experience, business efficiency and actionable insights through an integrated set of futuristic digital technologies.
Digital Transformation Services - Specialized in Design, Build, Develop, Integrate, and Manage cloud solutions, modernize data centers, build cloud-native applications, and migrate existing applications into secure, multi-cloud environments to support digital transformation. Our Digital Platform Services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation.
Digital Platform - Cloud Intelligent Management (CiM) - An autonomous hybrid cloud management platform that works across multi-cloud environments. It helps enterprises get the most out of their cloud strategy in their digital transformation while reducing cost and risk and increasing speed. To know more, please visit: http://www.everestdx.com
Posted 1 month ago
1.0 - 2.0 years
4 - 5 Lacs
India
On-site
About Our Team The Peer Review team is a dynamic and collaborative group dedicated to managing and facilitating the seamless journey of academic manuscripts from submission to decision on peer review. We serve as the central point of contact for authors, editors, and reviewers, ensuring the integrity, transparency, and efficiency of the peer review process. With a strong commitment to accuracy, responsiveness, and service excellence, we play a vital role in supporting the publication of high-quality scholarly research. What is your team’s key role in the business? The Peer Review team is a foundational part of Sage Publishing’s commitment to academic excellence. Peer review is the process by which experts in a relevant field evaluate a manuscript's quality, validity, and relevance before it is published in a journal. It ensures the integrity and credibility of scholarly research. As a Peer Review Associate (PRA), you will play a critical role in managing the end-to-end peer review process for scholarly journals using Sage Track. Our team’s responsibilities include screening incoming manuscripts for compliance with submission guidelines, coordinating reviewer assignments and follow-ups, and ensuring timely completion of tasks by editors, reviewers, and authors. We serve as the first point of contact for editorial queries, troubleshoot technical issues, maintain accurate records and templates, and ensure that accepted manuscripts are ready for production. Additionally, we act as a liaison between journal editors and Sage, upholding high standards of communication, organization, and responsiveness to support the timely and smooth operation of each journal’s peer review workflow. What other departments do you work closely with? We collaborate with several key departments to ensure a smooth and efficient workflow: Editorial: To ensure the smooth and timely progression of manuscripts at each stage of peer review, support editorial board needs, and uphold peer review standards. Production Operations: To make sure accepted manuscripts are ready for the production team. Customer Services: For handling author and reviewer queries and maintaining satisfaction. Journals Operations & APC Teams: For license management and processing Article Processing Charges (APCs). Commercial Sales & Marketing: To support journal growth and visibility through timely and quality-driven processes. Vendors: Partnering with external vendors for peer review support services. Could you be our new Peer Review Associate? We are looking for a detail-oriented, proactive Peer Review Associate to oversee and manage the peer review process for a portfolio of academic journals. The ideal candidate will have an excellent eye for detail, strong communication and problem-solving skills, and the ability to work effectively across departments and with external stakeholders based globally. This is a great opportunity to grow and build your career while contributing to the advancement of scholarly publishing in a collaborative, fast-paced environment. Key Responsibilities Manuscript Management Review and process incoming manuscripts via SAGE Track, ensuring they meet submission criteria and are ready for peer review. Invite and assign reviewers; support them throughout the review process. Monitor pending tasks for editors, associate editors (if applicable), reviewers, and authors, and send timely follow-up reminders.
Post-Acceptance Checks Ensure authors of accepted manuscripts complete and submit contributor forms via SAGE Track. Review accepted manuscripts for completeness and readiness for production (e.g., author contact info, editable file formats, permissions, and reference style compliance). Export completed manuscripts to the SAGE Production Editor in alignment with article deadlines. Communication & Support Respond promptly (within 24 hours, excluding weekends and holidays) to queries from journal editors, associate editors, authors, reviewers, and SAGE staff. Maintain and update email templates in SAGE Track according to journal-specific needs. Coordinate with ScholarOne support for any technical issues encountered on SAGE Track. Journal Oversight & Relationship Management For journals supported by Editorial Assistants, oversee peer review health by: Troubleshooting site issues Managing editor relationships Guiding Editorial Assistants Serving as their first point of escalation Act as the primary liaison between SAGE and journal editors, communicating key updates and ensuring smooth collaboration. Share Editorial Board updates with relevant Global Publishing Editors and Production Editors. Reporting & Monitoring Maintain a Daily Tracker to record ongoing tasks and activities. Submit weekly user performance reports to the Peer Review Supervisor and US Manager. Populate the weekly Overdue Task Report with updated journal comments. Run and share reports from SAGE Track periodically as requested. System & Site Maintenance Troubleshoot functionality issues on SAGE Track. Ensure journal sites remain current and aligned with global standards and initiatives. Collaborate with US and UK teams to implement peer review systems for new journals or migrate existing ones. Productivity Standards Manage a manuscript workload in alignment with the annual Work Allocation Plan (WAP) post-training. Provide timely, professional, and solution-oriented responses to all stakeholders. Support journal editors and internal teams with special projects and initiatives, as needed. Adhere to journal-specific editorial guidelines and processes outlined in the Journal Editor’s Guide. Teamwork & Collaboration Contribute ideas and feedback constructively to improve team operations. Assist fellow team members on designated journals when needed. Participate in departmental projects, committees, or task forces. Foster a collaborative, respectful, and positive team environment. Required Skills & Competencies Excellent written and verbal communication Strong attention to detail and organizational skills Ability to manage multiple tasks and meet tight deadlines Problem-solving mindset with a proactive approach Team-oriented, cooperative, and respectful demeanor Adaptable to new systems, processes, and priorities Qualifications & Experience Bachelor’s Degree (required) 1–2 years of relevant experience in the publishing industry, preferably in peer review or editorial support Diversity, Equity, and Inclusion At Sage we are committed to building a diverse and inclusive team that is representative of all sections of society and to sustaining a culture that celebrates difference, encourages authenticity, and creates a deep sense of belonging. We welcome applications from all members of society irrespective of age, disability, sex or gender identity, sexual orientation, color, race, nationality, ethnic or national origin, religion or belief as creating value through diversity is what makes us strong. 
As a business and as an organization with an increasingly agile workforce, we're open to flexible working arrangements where appropriate.
Posted 1 month ago
2.0 years
0 Lacs
Hyderābād
On-site
Job Summary: We are looking for an experienced and innovative Zoho Developer to join our team. The ideal candidate will have hands-on experience in designing, developing, and implementing solutions using Zoho applications, especially Zoho CRM, Creator, Books, Desk, and other Zoho One apps. You will work closely with stakeholders to automate processes, enhance productivity, and deliver customized Zoho-based solutions. Key Responsibilities: • Customize and configure Zoho CRM modules, workflows, functions, and templates. • Develop and maintain custom applications using Zoho Creator, Deluge scripting, and APIs. • Integrate Zoho apps with third-party tools (e.g., Google Workspace, payment gateways, external APIs). • Build dashboards, reports, and automation using Zoho Analytics and Zoho Flow. • Migrate data between platforms and ensure system integrity. • Troubleshoot and resolve application issues, bugs, or errors. • Work closely with the sales, marketing, HR, and finance teams to gather requirements and optimize workflows. • Provide support, training, and documentation for end users. Requirements: • Proven experience as a Zoho Developer or similar role (minimum 2+ years). • Proficiency in Deluge script, Zoho APIs, and custom functions. • Hands-on experience with Zoho CRM, Zoho Creator, Zoho Books, Zoho Desk, etc. • Strong knowledge of REST APIs and JSON/XML formats. • Familiarity with JavaScript, HTML/CSS, and database concepts is a plus. • Excellent problem-solving skills and attention to detail. • Ability to manage multiple projects and meet deadlines. Preferred Qualifications: • Zoho Certifications (Zoho CRM Certified Consultant, Creator Developer, etc.) • Experience in project management tools like Zoho Projects or Asana. • Background in SaaS product development or cloud platforms. Benefits: • Competitive salary • Learning and development support • Opportunity to work on diverse and exciting Zoho projects Job Type: Full-time Location Type: In-person Work Location: In person
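As a rough illustration of the "REST APIs and JSON" requirement above, a record-creation call against a CRM endpoint might look like the sketch below. The base URL and the Zoho-oauthtoken header follow Zoho's published v2 conventions but should be treated as assumptions and verified against the current Zoho CRM API documentation; the field names and token are placeholders.

```python
# Hedged sketch of a REST + JSON integration: pushing a Lead record into a CRM.
# Endpoint URL and auth header format are assumptions based on Zoho's documented
# v2 conventions; verify against the current Zoho CRM API docs before use.
import requests

API_BASE = "https://www.zohoapis.com/crm/v2"      # assumed data-centre base URL
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"  # obtained via the Zoho OAuth flow


def create_lead(last_name: str, company: str, email: str) -> dict:
    """Create a single Lead record and return the parsed JSON response."""
    payload = {"data": [{"Last_Name": last_name, "Company": company, "Email": email}]}
    response = requests.post(
        f"{API_BASE}/Leads",
        json=payload,
        headers={"Authorization": f"Zoho-oauthtoken {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(create_lead("Doe", "Acme Pvt Ltd", "jane.doe@example.com"))
```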
Posted 1 month ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: Business Agility Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders. Roles & Responsibilities: - Need Databricks resource with Azure cloud experience - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with data architects and analysts to design scalable data solutions. - Implement best practices for data governance and security throughout the data lifecycle. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Business Agility. - Strong understanding of data modeling and database design principles. - Experience with data integration tools and ETL processes. - Familiarity with cloud platforms and services related to data storage and processing. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15 years full time education is required.
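For a sense of the day-to-day work described above, a minimal PySpark ETL sketch on Databricks might look like this; the storage path, column names, and target table are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark ETL sketch, assuming a Databricks workspace. Paths, columns,
# and the target table below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: raw CSV landed in cloud storage (path is an assumption).
raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/orders/")
)

# Transform: basic cleansing and a simple data-quality filter.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("amount").isNotNull())
)

# Load: write to a Delta table for downstream consumers.
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")
```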
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
Senior Backend Engineer - Real-Time Trading Platform ChartRaider Gaming Platform Company Overview Join our innovative team building ChartRaider, a cutting-edge real-time trading simulation platform that combines the excitement of competitive gaming with financial market education. Our platform enables up to 1000 concurrent users to participate in live trading matches using real BTCUSDT price feeds in a gamified environment. Position Summary We are seeking an exceptional Senior Backend Engineer to architect, implement, and scale our real-time multiplayer trading platform. This role requires deep expertise in building high-performance, fault-tolerant systems capable of handling intense real-time interactions, complex matchmaking logic, and sophisticated trading mechanics. The ideal candidate will have proven experience designing microservices architectures, implementing real-time communication systems, and managing cloud infrastructure at scale. You'll be working on a system that processes live financial data, manages concurrent user sessions, and provides seamless real-time experiences across multiple game modes. Key Responsibilities System Architecture & Design Design and implement a scalable microservices architecture supporting concurrent real-time trading matches for 1000+ users Architect fault-tolerant systems with proper separation of concerns between API services, real-time services, and background workers Design and optimize data flow patterns for high-frequency trading simulations and live leaderboard updates Implement robust session management and reconnection logic for seamless user experiences during network interruptions Real-Time Systems Development Build and maintain WebSocket-based real-time communication systems using Socket.IO for live trading interactions Implement low-latency price feed integration with external financial data providers (Binance WebSocket API) Design and optimize Redis-based state management for live match data, player positions, and real-time leaderboards Develop complex trading logic including position management, stop-loss/take-profit automation, and profit/loss calculations Matchmaking & Game Logic Implement sophisticated matchmaking algorithms with atomic operations to ensure fair and balanced matches Design and build background worker systems for asynchronous match formation and data persistence Develop complex in-match mechanics including Action Bar progression, Peek Meter abilities, and Trade Delay features Create robust queue management systems with proper handling of edge cases and failure scenarios API Development & Integration Design comprehensive RESTful APIs for lobby management, user configuration, and historical data access Implement secure authentication and authorization systems using JWT tokens Build notification systems supporting both real-time alerts and persistent messaging Develop admin interfaces and monitoring endpoints for system health and match management Cloud Infrastructure & DevOps Manage AWS infrastructure including EC2 Auto Scaling Groups, Application Load Balancers, and ElastiCache clusters Implement and maintain CI/CD pipelines using GitHub Actions for automated testing and deployment Design disaster recovery procedures and implement proper backup strategies for critical data Monitor system performance and implement alerting for critical system metrics Data Management Design efficient database schemas using PostgreSQL for user data, match results, and audit trails Implement proper data persistence patterns with 
background workers for non-blocking database operations Optimize Redis usage for caching, session management, and real-time state storage Ensure data consistency across distributed systems and implement proper transaction management Required Technical Skills Core Technologies Node.js & Express: Minimum 5+ years of production experience building scalable backend services WebSocket Technologies: Extensive experience with Socket.IO or similar real-time communication frameworks Redis: Advanced knowledge of Redis data structures, pub/sub patterns, and cluster management PostgreSQL: Strong database design skills and experience with complex queries and optimization TypeScript: Proficiency in type-safe JavaScript development and modern ES6+ features Cloud & Infrastructure AWS Services: Hands-on experience with EC2, Auto Scaling Groups, Application Load Balancers, ElastiCache, and RDS Containerization: Experience with Docker and container orchestration (bonus: ECS/Fargate experience) CI/CD: Practical experience implementing automated deployment pipelines and infrastructure as code Monitoring: Familiarity with CloudWatch, logging strategies, and system observability practices Distributed Systems Microservices: Proven experience designing and implementing service-oriented architectures Message Queues: Experience with Redis pub/sub, job queues, and asynchronous processing patterns API Design: Strong understanding of RESTful API design principles and WebSocket event handling Caching Strategies: Knowledge of distributed caching patterns and cache invalidation strategies Preferred Qualifications Advanced Experience Financial Systems: Previous experience building trading platforms, financial data processing, or real-time market data systems Gaming Backend: Experience with multiplayer game servers, matchmaking systems, or real-time gaming platforms High-Scale Systems: Demonstrated experience building systems handling 1000+ concurrent connections Performance Optimization: Track record of identifying and resolving performance bottlenecks in distributed systems Additional Technical Skills Serverless Technologies: Experience with AWS Lambda, Fargate, or other serverless computing platforms Monitoring & Observability: Familiarity with X-Ray, Datadog, New Relic, or similar monitoring solutions Security: Knowledge of authentication systems, secure API design, and data protection practices Load Testing: Experience with performance testing tools and strategies for validating system scalability What You'll Be Working On Immediate Projects Migrate from monolithic architecture to distributed microservices running on Auto Scaling Groups Implement ElastiCache Redis clusters for improved performance and horizontal scaling Build comprehensive matchmaking system supporting multiple game modes and skill-based matching Develop real-time trading engine with complex position management and automated execution features Future Initiatives Transition to containerized deployments using ECS and Fargate for improved scalability and deployment velocity Implement advanced observability with distributed tracing and comprehensive monitoring dashboards Build sophisticated anti-cheat systems and fraud detection mechanisms Expand platform to support additional trading instruments beyond BTCUSDT Team & Culture Collaboration You'll work closely with our frontend developers building the Electron desktop application and web interface, product managers defining game mechanics and user experiences, and DevOps engineers managing 
our cloud infrastructure. We value technical excellence, code quality, and collaborative problem-solving. Growth Opportunities This role offers significant opportunities for technical leadership and architectural decision-making. You'll have the chance to influence major technology choices, mentor junior developers, and shape the technical direction of our platform as we scale to serve thousands of concurrent users. Technical Challenges You'll Solve Scalability Problems Design systems that gracefully handle traffic spikes during peak trading hours Implement efficient resource utilization across multiple AWS Availability Zones Optimize database queries and caching strategies for sub-second response times Build auto-scaling infrastructure that responds dynamically to user demand Real-Time Complexity Ensure consistent state management across distributed WebSocket connections Handle edge cases in real-time trading scenarios including network interruptions and race conditions Implement fair and accurate profit/loss calculations with precise timing requirements Design resilient systems that maintain data integrity during high-frequency updates Integration Challenges Build reliable connections to external financial data providers with proper error handling and circuit breakers Implement seamless user experience across desktop and web platforms Design APIs that support both real-time and historical data access patterns Create monitoring systems that provide actionable insights into system performance How to Apply Please submit your resume along with a cover letter describing your experience with real-time systems and distributed architectures. Include specific examples of scalable backend systems you've built and any experience with financial or gaming platforms. We're particularly interested in hearing about complex technical challenges you've solved and your approach to system design at scale. This role requires the ability to work in a fast-paced environment with changing requirements and the flexibility to adapt technical solutions as our platform evolves. We're looking for someone who thrives on technical challenges and is excited about building the next generation of financial education gaming platforms.
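The matchmaking responsibility above ("atomic operations to ensure fair and balanced matches") usually comes down to popping a full match's worth of queued players in a single atomic step. The production stack here is Node.js/Socket.IO, but the Redis pattern is language-agnostic; the sketch below shows it with Python and redis-py, with the queue key, match size, and player IDs as placeholders.

```python
# Hedged sketch of an atomic matchmaking pop using a Redis Lua script.
# Queue key, match size, and player IDs are placeholders.
import redis

MATCH_SIZE = 4
QUEUE_KEY = "matchmaking:queue"

# Lua executes atomically inside Redis: only pop players if a full match is queued.
POP_MATCH_LUA = """
if redis.call('LLEN', KEYS[1]) >= tonumber(ARGV[1]) then
    local players = {}
    for i = 1, tonumber(ARGV[1]) do
        players[i] = redis.call('LPOP', KEYS[1])
    end
    return players
end
return nil
"""

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
pop_match = r.register_script(POP_MATCH_LUA)


def enqueue(player_id: str) -> None:
    r.rpush(QUEUE_KEY, player_id)


def try_form_match():
    """Return a list of player IDs if a full match could be formed, else None."""
    return pop_match(keys=[QUEUE_KEY], args=[MATCH_SIZE])


if __name__ == "__main__":
    for pid in ["p1", "p2", "p3", "p4", "p5"]:
        enqueue(pid)
    print(try_form_match())  # ['p1', 'p2', 'p3', 'p4']
    print(try_form_match())  # None -- only one player left in the queue
```

Because the length check and the pops happen inside one Lua call, two worker processes can never split the same group of players between two matches, which is the fairness guarantee the posting alludes to.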
Posted 1 month ago
4.0 years
5 - 8 Lacs
Gurgaon
On-site
Work Flexibility: Hybrid Who we want: Someone who can plan, implement, maintain, and migrate the software development infrastructure. Introduce and oversee software development automation across cloud services like Amazon Web Services (AWS) and Azure. Help develop, manage, and monitor continuous integration (CI) and continuous deployment (CD) pipelines. Excellent skills in applying Continuous Integration, Continuous Deployment and Continuous Delivery processes & tools [Git, Maven, Jenkins, Ansible, Nagios, Apache Tomcat, Docker, etc.] Collaborate with software developers, QA specialists, and other DevOps team members to ensure timely and successful delivery of new software releases. Contribute to software design and development, including code review and feedback. Assist with troubleshooting and problem-solving when issues arise. Keep up with the latest industry trends and best DevOps practices while ensuring the company meets configuration requirements. Participate in team improvement initiatives. Help create and maintain internal documentation using Git or other similar version control applications. Provide on-call support as needed. What you will do: Ability to code and script in multiple programming languages and automation frameworks like Python, C#, Java, Perl, Ruby, SQL Server, NoSQL, and MySQL Understanding of the best security practices and automating security testing and updating in the CI/CD pipelines Ability to conveniently deploy monitoring and logging infrastructure using tools like Prometheus, Nagios, and Datadog Proficiency in container frameworks like Docker and Kubernetes Mastery in the use of infrastructure as code (IaC) tools like Terraform and Ansible and command line interfaces (CLI) for Microsoft Azure, Amazon AWS, and other cloud infrastructure platforms Creative and analytical thinker with strong problem-solving skills Must demonstrate exceptional verbal and written communication skills Must demonstrate ability to communicate effectively at all levels of the organization Previous experience working in a SCRUM or Agile environment preferred Previous software development experience is preferred What you need: Minimum Qualifications (Required): Bachelor’s degree in Computer Science, Computer Engineering or a related field is preferred, and a minimum of 4+ years of experience as a Tools administrator or relevant experience is required. Preferred Qualifications (Strongly desired): Proven ability to design and implement new processes and facilitate user adoption. Excellent skills in applying Continuous Integration, Continuous Deployment and Continuous Delivery processes & tools [Git, Maven, Jenkins, Ansible, Nagios, Apache Tomcat, Docker, etc.] Excellent skills in applying Continuous Integration, Continuous Deployment and Continuous Delivery with a focus on automation Knowledge of Jama workflows and permission schemes is preferred. Understanding of engineering lifecycle phases and design control processes is preferred. Experience validating third-party tools to comply with the Quality Management system and to follow FDA guidance is preferred. Knowledge of System Modeling, Requirement Management and Test Management best practices when working with multiple concurrent programs is preferred. Experience with software and hardware defect tracking processes is preferred. Ability to communicate and work with external software vendors is preferred. Ability to work independently and manage priorities on multiple tasks is required. 
Excellent verbal and written communication skills are required. Ability to work in a regulated environment in compliance with ISO 13485 and 21 CFR 820 is preferred. Travel Percentage: 10%
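One small, concrete example of the CI/CD work described above is a post-deployment smoke check that a pipeline stage (e.g. a Jenkins step) could run before promoting a release. This is a hedged sketch rather than any specific pipeline in use here; the endpoints are placeholders.

```python
# Hedged sketch of a post-deployment smoke check run from a CI/CD stage.
# Endpoint URLs are placeholders; a non-zero exit code fails the stage.
import sys

import requests

ENDPOINTS = [
    "https://app.example.internal/health",   # assumed health endpoint
    "https://api.example.internal/health",
]


def check(url: str, timeout: float = 5.0) -> bool:
    try:
        resp = requests.get(url, timeout=timeout)
        ok = resp.status_code == 200
        print(f"{'OK  ' if ok else 'FAIL'} {url} -> {resp.status_code}")
        return ok
    except requests.RequestException as exc:
        print(f"FAIL {url} -> {exc}")
        return False


if __name__ == "__main__":
    # Failing the script blocks promotion of the release in the pipeline.
    sys.exit(0 if all(check(u) for u in ENDPOINTS) else 1)
```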
Posted 1 month ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Company Description Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Job Description Key Responsibilities Design, build, and maintain scalable and secure relational and cloud-based database systems. Migrate data from spreadsheets or third-party sources into databases (PostgreSQL, MySQL, BigQuery). Create and maintain automated workflows and scripts for reliable, consistent data ingestion. Optimize query performance and indexing to improve data retrieval efficiency. Implement access controls, encryption, and data security best practices to ensure compliance. Monitor database health and troubleshoot issues proactively using appropriate tools. Collaborate with full-stack developers and data researchers to align data architecture with application needs. Uphold data quality through validation rules, constraints, and referential integrity checks. Keep up-to-date with emerging technologies and propose improvements to data workflows. Leverage tools like Python (Pandas, SQLAlchemy, PyDrive), and version control (Git). Support Agile development practices and CI/CD pipelines where applicable. Required Skills And Experience Strong SQL skills and understanding of database design principles (normalization, indexing, relational integrity). Experience with relational databases such as PostgreSQL or MySQL. Working knowledge of Python, including data manipulation and scripting (e.g., using Pandas, SQLAlchemy). Experience with data migration and ETL processes, including integrating data from spreadsheets or external sources. Understanding of data security best practices, including access control, encryption, and compliance. Ability to write and maintain import workflows and scripts to automate data ingestion and transformation. 
Experience with cloud-based databases, such as Google BigQuery or AWS RDS. Familiarity with cloud services (e.g., AWS Lambda, GCP Dataflow) and serverless data processing. Exposure to data warehousing tools like Snowflake or Redshift. Experience using monitoring tools such as Prometheus, Grafana, or the ELK Stack. Good analytical and problem-solving skills, with strong attention to detail. Team collaboration skills, especially with developers and analysts, and ability to work independently. Proficiency with version control systems (e.g., Git). Strong communication skills — written and verbal. Preferred / Nice-to-Have Skills Bachelor’s degree in Computer Science, Information Systems, or a related field. Experience working with APIs for data ingestion and third-party system integration. Familiarity with CI/CD pipelines (e.g., GitHub Actions, Jenkins). Python experience using modules such as gspread, PyDrive, PySpark, or object-oriented design patterns. Experience in Agile/Scrum teams or working with product development cycles. Experience using Tableau and Tableau Prep for data visualization and transformation. Why Join Us Monthly long weekends — every third Friday off Wellness reimbursement to support your health and balance Paid parental leave Remote-first with flexibility and trust Work with a world-class data and marketing team inside a globally recognized brand Qualifications 5+ Years exp in Database Engineering. Additional Information Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves
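A minimal sketch of the spreadsheet-to-database import workflow mentioned above, using pandas and SQLAlchemy; the connection string, file path, table, and column names are assumptions for illustration only.

```python
# Hedged sketch of an automated spreadsheet-to-Postgres import.
# Connection string, paths, table, and column names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

ENGINE = create_engine("postgresql+psycopg2://etl_user:secret@localhost:5432/research")


def import_spreadsheet(csv_path: str, table: str) -> int:
    df = pd.read_csv(csv_path)

    # Light validation before load: drop exact duplicates and rows missing a key.
    df = df.drop_duplicates().dropna(subset=["record_id"])

    # Append into the relational table; constraints in Postgres enforce integrity.
    df.to_sql(table, con=ENGINE, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    rows = import_spreadsheet("exports/research_2025.csv", "research_records")
    print(f"Loaded {rows} rows")
```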
Posted 1 month ago
0 years
2 - 3 Lacs
India
On-site
1. Build connections and prospects via LinkedIn and other sources 2. Source leads from multiple social media 3. Cold call HR heads/CEOs 4. Demonstrate the product via Zoom or another video meeting 5. Explain the benefits of the software application and encourage prospects to migrate to digital mode 6. Follow up, convince, and close the deal 7. Maintain relationships with clients 8. Work closely with the digital marketing team and contribute ideas for social media posts to drive client engagement and conversion 9. Advise the product development team about deviations and new feature development to enhance customer experience 10. Offer client services and ensure renewal of the services This position does not require field work. Desired Profile Candidates with experience in software, ITES, enterprise, or B2B sales are preferred. Candidates who have sold products or services to HR professionals are preferred first. Should have good communication skills in English. Proficient with software tools, LinkedIn, and other social platforms. Job Types: Full-time, Permanent Pay: ₹18,000.00 - ₹25,000.00 per month Schedule: Day shift Supplemental Pay: Performance bonus Work Location: In person
Posted 1 month ago
10.0 years
6 - 10 Lacs
Noida
On-site
Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 25-Jun-2025 Job ID 9042 Description and Requirements Position Summary We are seeking a forward-thinking and enthusiastic Engineering and Operations Specialist to manage and optimize our Splunk platforms. The ideal candidate will have in-depth experience in at least one of these technologies; experience with MongoDB and other databases is a plus. Job Responsibilities Work on engineering and operational tasks for Splunk platforms, ensuring high availability and stability. Continuously improve the stability of the environment, leveraging automation and self-healing mechanisms. Develop and implement automation using technologies such as Ansible, Python, Shell. Install, configure, and maintain Splunk applications, indexers, search heads, and forwarders. Optimize search queries, configure data retention policies, and manage Splunk indexer storage to ensure optimal performance and resource utilization. Maintain and improve Splunk dashboard functionality and visualization for the Information Security department. Implement and maintain Splunk platform infrastructure and configuration. Develop reliable, efficient queries, summary indexes, and data models that will feed custom alerts and dashboards. Use and create dashboards and apps for platform auditing functions. Manage apps/dashboards for license usage and application errors. Monitor the Splunk infrastructure for capacity planning and optimization. Maintain uniform Splunk dashboards across the organization. Migrate Splunk dashboards from superseded versions to current versions. Perform after-the-fact investigations utilizing Splunk capabilities. Ensure data quality is in line with the use cases and maintain current functional and technical knowledge of the Splunk platform. Mentor and guide other team members to understand the use cases of Splunk. Provide regular support and guidance to a variety of teams on complex solutions and issue resolutions. Lead proof-of-concepts on Splunk implementation. Monitor and tune Splunk to optimize performance, identifying bottlenecks and troubleshooting issues. Analyze database queries, indexing, and storage to ensure minimal latency and maximum throughput. The Splunk System Administrator will build, maintain, and standardize the Splunk platform, including forwarder deployment, configuration, dashboards, and maintenance across Linux OS. Perform application administration for a single security information management system. Other related functions as assigned. Able to debug production issues by analyzing the logs directly and using tools like Splunk. Work in an Agile model with an understanding of Agile concepts and Azure DevOps. Learn new technologies based on demand and help team members by coaching and assisting. Education, Technical Skills & Other Critical Requirement Education Bachelor’s degree in Computer Science, Information Systems, or another related field with 10+ years of IT and infrastructure engineering work experience. Splunk Certified Administrator is a plus. Experience with cloud platforms like AWS, Azure, or Google Cloud. Experience (In Years) 10+ years total IT experience & 7+ years relevant experience as a Splunk Administrator Technical Skills In-depth experience with Splunk; exposure to MongoDB is a plus. Strong enthusiasm for learning and adopting new technologies. 
Must have experience with automation tools like Ansible, Python, and Shell. Proficiency in CI/CD deployments, DevOps practices, and managing code repositories. Strong knowledge of Infrastructure/Configuration as Code principles. Developer experience is highly desired. Data engineering skills are a plus. Working experience with other DB technologies and observability tools is a plus. Setting up Splunk forwarding for new application tiers introduced into the environment. Strong knowledge of debugging Splunk forwarding on existing application tiers currently deployed. Manage apps/dashboards for license usage and application errors. Must have extensive experience implementing and maintaining Splunk platform infrastructure and configuration. Monitor the Splunk infrastructure for capacity planning and optimization. Must be familiar with Git best practices and repo management (push, branching, pull requests), with experience managing or executing playbooks or cookbooks at scale. Must have Linux OS debugging skills. Working experience with Elastic. Strong working knowledge of Splunk Search Processing Language (SPL), architecture, and various components (indexer, forwarder, search head, deployment server). Splunk deployment experience; configuration of Splunk, forwarders, indexes, dashboards, search strings. The ability to perform onsite configuration and maintenance of Splunk deployments in Linux (on-premises) and cloud environments. Practical OS knowledge of Linux and Unix is necessary for constructing effective Splunk search strings. Experience with Splunk migration and upgrades on standalone Linux OS and cloud platforms is a plus. Work experience in both database and Splunk replication between primary and secondary servers to ensure high availability and fault tolerance. Managed infrastructure security policy per industry best practices by designing, configuring, and implementing privileges and policies on databases as well as Splunk using RBAC. Scripting skills and automation experience using DevOps, repos, and Infrastructure as Code. Working experience with containers (AKS and OpenShift) is a plus. Working experience with cloud platforms (Azure, Cosmos DB) is a plus. Strong knowledge of ITSM processes and tools (ServiceNow). Ability to work 24*7 rotational shifts to support the Database and Splunk platforms. Other Critical Requirements Strong problem-solving abilities and a proactive approach to identifying and resolving issues. Excellent communication and collaboration skills. Ability to work in a fast-paced environment and manage multiple priorities effectively. Must have leadership experience. About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
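To make the "automation using Python" requirement concrete, a small health-check style script using the official splunk-sdk might look like the sketch below; the host, credentials, and SPL query are placeholders and should be adapted to the actual environment.

```python
# Hedged sketch of Python-based Splunk automation: run an SPL search via the
# official splunk-sdk (pip install splunk-sdk). Host, credentials, and the query
# itself are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.internal",   # assumed search head
    port=8089,
    username="svc_automation",
    password="REPLACE_ME",
)

# Illustrative SPL: license usage by pool over the last 24 hours.
SPL = (
    "search index=_internal source=*license_usage.log type=Usage earliest=-24h "
    "| stats sum(b) as bytes by pool"
)

# A blocking one-shot search; the reader yields result rows as dicts.
stream = service.jobs.oneshot(SPL)
for result in results.ResultsReader(stream):
    if isinstance(result, dict):
        print(result)
```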
Posted 1 month ago
0 years
1 - 3 Lacs
India
Remote
Job Summary: We are seeking a skilled and proactive Server and Security Administrator to manage, secure, and maintain our physical and virtual server infrastructure. The role requires deep technical expertise in server environments, network security, system hardening, and threat mitigation. You will be responsible for ensuring system uptime, implementing security best practices, and supporting infrastructure scalability. Key Responsibilities: Configure, manage, and maintain Linux/Windows-based servers (shared, VPS, or dedicated). Perform regular updates, patches, and server hardening for security and performance. Migrate and maintain WordPress websites across different hosting providers. Manage DNS settings, domains, and email hosting (cPanel, Plesk, or external providers like Google Workspace, Zoho Mail). Monitor server performance, uptime, and perform routine backups and restores. Set up and manage firewalls, malware protection, SSL certificates, and secure remote access (VPN, SSH). Respond to and investigate security alerts or breaches; apply preventive measures. Automate common server tasks using scripts (e.g., Bash, PowerShell, Python). Document system configurations, processes, and troubleshooting guidelines. Collaborate with development and support teams for deployment and system issues. Required Skills & Qualifications: Strong knowledge of Linux server environments (Ubuntu, CentOS, Debian) and/or Windows Server. Hands-on experience with shared hosting (cPanel, Plesk), VPS and cloud-based environments. Proficiency in DNS management and email configuration (MX records, SPF, DKIM, DMARC). Solid understanding of WordPress migrations, site performance tuning, and security best practices. Knowledge of server monitoring tools (e.g., UptimeRobot, Netdata, Nagios, or similar). Experience managing backups and disaster recovery solutions. Familiarity with SSL certificate management and web server configuration (Apache, Nginx). Working knowledge of cybersecurity fundamentals including firewalls, brute-force protection, etc. Good communication and documentation skills. Location: Kolkata Job Type: Permanent/Full-time Job Types: Full-time, Permanent Pay: ₹15,000.02 - ₹25,000.00 per month Benefits: Paid sick time Schedule: Day shift Evening shift Monday to Friday Morning shift Night shift Work Location: In person Expected Start Date: 27/06/2025
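One routine automation task from the list above, SSL certificate management, can be scripted with the Python standard library alone; the hostnames below are placeholders, and this is a hedged sketch rather than a prescribed tool.

```python
# Hedged sketch: report how many days remain before a site's TLS certificate
# expires. Hostnames are placeholders for the servers actually being managed.
import socket
import ssl
from datetime import datetime, timezone


def days_until_expiry(hostname: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # Convert the certificate's notAfter field into a UTC datetime.
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days


if __name__ == "__main__":
    for host in ["example.com", "www.example.org"]:   # assumed monitored sites
        remaining = days_until_expiry(host)
        flag = "RENEW SOON" if remaining < 30 else "ok"
        print(f"{host}: {remaining} days left ({flag})")
```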
Posted 1 month ago
15.0 years
0 Lacs
Indore
On-site
Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: Business Agility Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders. Roles & Responsibilities: - Need Databricks resource with Azure cloud experience - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with data architects and analysts to design scalable data solutions. - Implement best practices for data governance and security throughout the data lifecycle. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Business Agility. - Strong understanding of data modeling and database design principles. - Experience with data integration tools and ETL processes. - Familiarity with cloud platforms and services related to data storage and processing. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Pune office. - A 15 years full time education is required.
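Since the role stresses ensuring data quality, a small PySpark validation step of the kind an orchestrated Databricks job might run is sketched below; the table and column names are placeholders rather than anything specified in the posting.

```python
# Hedged sketch of a PySpark data-quality gate, assuming a Databricks environment.
# Table and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("analytics.orders_clean")   # assumed table from an upstream pipeline

checks = {
    "null_order_ids": df.filter(F.col("order_id").isNull()).count(),
    "duplicate_order_ids": df.count() - df.dropDuplicates(["order_id"]).count(),
    "negative_amounts": df.filter(F.col("amount") < 0).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # Raising here lets an orchestrator (e.g. a Databricks job) alert and halt.
    raise ValueError(f"Data-quality checks failed: {failed}")
print("All data-quality checks passed")
```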
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
Consultants should have the following skills & expertise: Certified in the SuccessFactors Validated LMS (VLMS) Solution. Should have done at least 2 end-to-end implementations and 2 support projects. Excellent communication & presentation skills and must be a team player. Excellent understanding of the Learning Management process in global organizations. Define business requirements and perform fit-gap analysis between client requirements and the standard SuccessFactors LMS solution. Expertise in providing consulting services to global organizations on HCM best practices and helping clients migrate to SAP HCM Cloud solutions. Expertise in creating custom reports in the Plateau Report Designer tool using SQL, JavaScript, and XML. Translate business requirements into system configuration objects and create the solution design for the SuccessFactors Validated LMS Solution (VLMS) in compliance with best practices. Hands-on experience with all the data models and excellent knowledge of XML. System configuration in accordance with the Solution Design & Configuration Workbook / Business Blueprint. Expertise in translations; must upload translation packs for data models configuration and MDF. Preparation & execution of test cases / test plans / test scripts. Strong learning ability, agility, and willingness to acquire new competencies and adapt quickly to new tasks and environments. Must have Skills: Implementation exposure in SF LMS Experience: BE/MBA with sound industry experience of 5+ years
Posted 1 month ago