3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience Required: 3 years
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-Have Skills: Experience with Apache Spark and data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience in programming languages such as Python or Scala for data processing.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
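The pipeline duties this posting describes, extract, validate, transform, load, can be illustrated with a minimal plain-Python sketch. The record fields and the null-rate threshold here are invented for illustration; an actual Databricks job would operate on PySpark DataFrames rather than lists of dicts.

```python
# Minimal ETL sketch with a data-quality gate.
# Field names ("id", "amount") and the 10% null threshold are
# illustrative assumptions, not any specific job's contract.

def extract(rows):
    """Pretend source system: materialize raw records as dicts."""
    return list(rows)

def quality_check(records, required=("id", "amount"), max_null_rate=0.1):
    """Reject the batch if too many records are missing required fields."""
    bad = sum(1 for r in records if any(r.get(f) is None for f in required))
    rate = bad / len(records) if records else 0.0
    return rate <= max_null_rate

def transform(records):
    """Normalize amounts to float and drop incomplete records."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in records
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(records, sink):
    """Append cleaned records to an in-memory sink (stand-in for a table)."""
    sink.extend(records)
    return len(records)

def run_pipeline(source_rows, sink):
    """Extract, gate on quality, transform, load; returns rows loaded."""
    batch = extract(source_rows)
    if not quality_check(batch):
        raise ValueError("data-quality check failed; batch rejected")
    return load(transform(batch), sink)
```

Rejecting the whole batch on a failed quality gate, rather than silently dropping rows, is one common design choice; the alternative is to quarantine bad rows and continue.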
Posted 1 week ago
2.0 - 6.0 years
6 - 9 Lacs
Hyderabad
Work from Office
OPENTEXT - THE INFORMATION COMPANY
OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation.
AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do, powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us.
Your Impact
An OpenText Content Server Consultant is responsible for the technical delivery of xECM-based solutions. Such delivery activities encompass development, testing, deployment and documentation of specific software components, either providing extensions to specific items of core product functionality or implementing specific system integration components. This role has a heavy deployment and administration emphasis. Engagements are usually long term, but some relatively short ones requiring only specific services, such as an upgrade or a migration, also happen. The nature of work may include full application lifecycle activities, from development, deployment/provisioning, testing, migration and decommissioning through ongoing run & maintain (upgrades, patching etc.) support. The role is customer facing and requires excellent interpersonal skills, with the ability to communicate with a wide range of stakeholders (internally and externally), both verbally and in writing.
What the Role offers
Work within an OpenText technical delivery team to:
- Participate in and contribute to deployment activities.
- Participate in the day-to-day administration of the systems, including Incident & Problem Management.
- Participate in planning and execution of new implementations, upgrades and patching activities.
- Participate in the advanced configuration of ECM software components, in line with project and customer timescales.
- Actively contribute to automating provisioning, patching and upgrade activities where possible to achieve operational efficiencies.
- Perform code reviews and periodic quality checks to ensure delivery quality is maintained.
- Prepare, maintain and submit activity/progress reports and time recording/management reports in accordance with published procedures.
- Keep project managers informed of activities and alert them to any issues promptly.
- Provide input at engagement closure on project learnings and suggest improvements.
- Utilize exceptional written and verbal communication skills while supporting customers via web, telephone, or email, demonstrating a high level of customer focus and empathy.
- Respond to and solve customer technical requests, showing an understanding of the customer's managed hosted environment and applications within OpenText, enabling resolution of complex technical issues.
- Document or implement proposed solutions.
- Respond to and troubleshoot alerts from monitoring of applications, servers and devices, sufficient to meet service level agreements.
- Collaborate on cross-team and cross-product technical issues with a variety of resources, including Product Support, IT, and Professional Services.
What you need to succeed
- Well versed in deployment, administration and troubleshooting of the OpenText xECM platform and surrounding components (Content Server, Archive Center, Brava, OTDS, Search & Indexing) and integrations with SAP, SuccessFactors, and Salesforce.
- Good experience/knowledge of the following:
- Experience working in an ITIL-aligned service delivery organisation.
- Knowledge of Windows, UNIX, and application administration skills in a TCP/IP networked environment.
- Experience working with relational DBMSs (PostgreSQL, Oracle, MS SQL Server, MySQL); able to independently construct moderately complex SQL queries without guidance.
- Programming/scripting is highly desirable (e.g. OScript, Java, JavaScript, PowerShell, Bash).
- Familiarity with configuration and management of web/application servers (IIS, Apache, Tomcat, JBoss, etc.).
- Good understanding of object-oriented programming, Web Services, and LDAP configuration.
- Experience installing and configuring xECM in HA, and knowledge of DR setup/drills.
- Experience in patching, major upgrades and data migration activities.
The candidate should possess:
- Team player
- Customer focus and alertness
- Attention to detail
- Always learning
- Critical thinking
- Highly motivated
- Good written and oral communication
- Knowledge sharing, blogs
OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.
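As an illustration of the "moderately complex SQL without guidance" requirement in this posting, the sketch below builds a join-plus-aggregate query using Python's built-in sqlite3 module. The schema and data are invented for the example; the posting's actual databases are PostgreSQL, Oracle, MS SQL Server and MySQL, where the same SQL shape applies.

```python
import sqlite3

# Hypothetical schema: documents owned by users. Counting documents per
# user -- including users who own none -- is a typical moderate-complexity
# LEFT JOIN + GROUP BY task.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE documents (id INTEGER PRIMARY KEY, owner_id INTEGER,
                            title TEXT);
    INSERT INTO users VALUES (1, 'asha'), (2, 'bert');
    INSERT INTO documents VALUES (10, 1, 'spec'), (11, 1, 'plan');
""")

def docs_per_user(conn):
    """LEFT JOIN so users without documents still appear, with a count of 0."""
    cur = conn.execute("""
        SELECT u.name, COUNT(d.id) AS n_docs
        FROM users u
        LEFT JOIN documents d ON d.owner_id = u.id
        GROUP BY u.id, u.name
        ORDER BY n_docs DESC, u.name
    """)
    return cur.fetchall()
```

The LEFT JOIN is the detail that separates this from a beginner query: an INNER JOIN would silently drop users with zero documents.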
Posted 1 week ago
0.0 - 4.0 years
25 - 27 Lacs
Bengaluru
Work from Office
Your opportunity
Do you love the transformative impact data can have on a business? Are you motivated to push for results and overcome all obstacles? Then we have a role for you.
What you'll do
- Lead the building of scalable, fault-tolerant pipelines with built-in data quality checks that transform, load and curate data from various internal and external systems.
- Provide leadership to cross-functional initiatives and projects. Influence architecture design and decisions.
- Build cross-functional relationships with Data Scientists, Product Managers and Software Engineers to understand data needs and deliver on those needs.
- Improve engineering processes and cross-team collaboration.
- Ruthlessly prioritize work to align with company priorities.
- Provide thought leadership to grow and evolve the DE function and drive implementation of SDLC best practices in building internal-facing data products by staying up to date with industry trends, emerging technologies, and best practices in data engineering.
This role requires
- Experience in BI and Data Warehousing.
- Strong experience with dbt, Airflow and Snowflake.
- Experience with Apache Iceberg tables.
- Experience and knowledge of building data lakes in AWS (i.e. Spark/Glue, Athena), including data modeling, data quality best practices, and self-service tooling.
- Experience mentoring data professionals from junior to senior levels.
- Demonstrated success leading cross-functional initiatives.
- Passionate about data quality, code quality, SLAs and continuous improvement.
- Deep understanding of data system architecture.
- Deep understanding of ETL/ELT patterns.
- Development experience in at least one object-oriented language (Python, R, Scala, etc.).
- Comfortable with SQL and related tooling.
Bonus points if you have
- Experience with Observability.
Please note that visa sponsorship is not available for this position.
Fostering a diverse, welcoming and inclusive environment is important to us.
We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics' different backgrounds and abilities, and recognize the different paths they took to reach us, including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be. We're looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.
Our hiring process
In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification. Note: Our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic. We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law, including, but not limited to, the San Francisco Fair Chance Ordinance.
Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes, and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic.
Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics.
Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
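Orchestration tools like the Airflow and dbt named in this listing ultimately run pipeline tasks in dependency order. That core idea can be sketched with the standard library's graphlib; the task names below are invented, and a real DAG would attach executable work to each node.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on --
# the same shape a dbt or Airflow DAG resolves before execution.
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

def run_order(dag):
    """Return one valid execution order for the task graph.
    Raises graphlib.CycleError if the dependencies are circular."""
    return list(TopologicalSorter(dag).static_order())
```

For this linear chain the order is fully determined; in a wider DAG, tasks whose dependencies are all satisfied could run in parallel, which is exactly what Airflow's scheduler exploits.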
Posted 1 week ago
14.0 - 17.0 years
50 - 55 Lacs
Hyderabad
Work from Office
As a Senior Application Architect, you will define and enhance the Risk Data solutions technology architecture and engineering. You will also be fully responsible for the end-to-end detailed architecture for delivering the strategic initiatives for specific product(s).
Key Responsibilities:
- Hands-on Expertise: You will need deep expertise in a diverse and complex tech stack, enabling you to solve challenging problems and guide engineering teams effectively. Your proficiency should include the following technologies: C#, .NET Core, Python, AngularJS, Visual Studio, IntelliJ, Postman, Azure Data Studio, SQL Server, ClickHouse, Selenium WebDriver (Java), Apache JMeter, Azure Storage Explorer, Blob Storage, Power BI Desktop, GitHub Desktop, Docker Desktop, WinMerge, Lens, kubectl, and Helm.
- Cloud Architecture Design and Implementation: Your role will involve designing and implementing scalable, secure cloud architectures tailored for various risk intelligence applications and systems. This requires a deep understanding of cloud technologies and the ability to align them with business needs.
- Collaboration: You will collaborate closely with enterprise architecture, IT, and business teams to understand their requirements and deliver customized cloud-based solutions. This collaboration is essential in ensuring that the solutions not only meet technical specifications but also align with strategic business goals.
- Security Compliance: Ensuring compliance with industry security standards and best practices is critical. You will implement robust security measures within the cloud architecture to protect sensitive data and maintain the integrity of systems.
- Technical Leadership and Mentorship: As a leader, you will provide technical guidance and mentorship to other team members. This includes leading by example, sharing knowledge, and fostering a culture of continuous learning and improvement.
- Resource Management: Efficiently managing cloud resources is vital.
You will optimize the infrastructure to achieve maximum efficiency and cost-effectiveness, ensuring that the organization gets the best value from its cloud investments.
- Industry Trends and Technological Advancements: Staying updated with the latest industry trends and cloud technologies is crucial. You will continuously monitor advancements in the field to incorporate cutting-edge solutions and maintain the competitive edge of the organization.
Your role as a Senior Cloud Architect is pivotal in shaping the technology landscape of this business unit, driving innovation, and ensuring that the solutions delivered are robust, secure, and aligned with business objectives.
About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Reference Code: 134635
Posted 1 week ago
3.0 - 8.0 years
13 - 14 Lacs
Hyderabad
Work from Office
OPENTEXT - THE INFORMATION COMPANY
OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation.
AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do, powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us.
Your Impact
An OpenText Content Server Consultant is responsible for the technical delivery of xECM-based solutions. Such delivery activities encompass development, testing, deployment and documentation of specific software components, either providing extensions to specific items of core product functionality or implementing specific system integration components. This role has a heavy deployment and administration emphasis. Engagements are usually long term, but some relatively short ones requiring only specific services, such as an upgrade or a migration, also happen. The nature of work may include full application lifecycle activities, from development, deployment/provisioning, testing, migration and decommissioning through ongoing run & maintain (upgrades, patching etc.) support. The role is customer facing and requires excellent interpersonal skills, with the ability to communicate with a wide range of stakeholders (internally and externally), both verbally and in writing.
What the Role offers
Work within an OpenText technical delivery team to:
- Participate in and contribute to deployment activities.
- Participate in the day-to-day administration of the systems, including Incident & Problem Management.
- Participate in planning and execution of new implementations, upgrades and patching activities.
- Participate in the advanced configuration of ECM software components, in line with project and customer timescales.
- Actively contribute to automating provisioning, patching and upgrade activities where possible to achieve operational efficiencies.
- Perform code reviews and periodic quality checks to ensure delivery quality is maintained.
- Prepare, maintain and submit activity/progress reports and time recording/management reports in accordance with published procedures.
- Keep project managers informed of activities and alert them to any issues promptly.
- Provide input at engagement closure on project learnings and suggest improvements.
- Utilize exceptional written and verbal communication skills while supporting customers via web, telephone, or email, demonstrating a high level of customer focus and empathy.
- Respond to and solve customer technical requests, showing an understanding of the customer's managed hosted environment and applications within OpenText, enabling resolution of complex technical issues.
- Document or implement proposed solutions.
- Respond to and troubleshoot alerts from monitoring of applications, servers and devices, sufficient to meet service level agreements.
- Collaborate on cross-team and cross-product technical issues with a variety of resources, including Product Support, IT, and Professional Services.
What you need to succeed
- Well versed in deployment, administration and troubleshooting of the OpenText xECM platform and surrounding components (Content Server, Archive Center, Brava, OTDS, Search & Indexing) and integrations with SAP, SuccessFactors, and Salesforce.
- Good experience/knowledge of the following:
- Experience working in an ITIL-aligned service delivery organisation.
- Knowledge of Windows, UNIX, and application administration skills in a TCP/IP networked environment.
- Experience working with relational DBMSs (PostgreSQL, Oracle, MS SQL Server, MySQL); able to independently construct moderately complex SQL queries without guidance.
- Programming/scripting is highly desirable (e.g. OScript, Java, JavaScript, PowerShell, Bash).
- Familiarity with configuration and management of web/application servers (IIS, Apache, Tomcat, JBoss, etc.).
- Good understanding of object-oriented programming, Web Services, and LDAP configuration.
- Experience installing and configuring xECM in HA, and knowledge of DR setup/drills.
- Experience in patching, major upgrades and data migration activities.
The candidate should possess:
- Team player
- Customer focus and alertness
- Attention to detail
- Always learning
- Critical thinking
- Highly motivated
- Good written and oral communication
- Knowledge sharing, blogs
Posted 1 week ago
2.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Every career journey is personal. That's why we empower you with the tools and support to create your own success story. Be challenged. Be heard. Be valued. Be you... be here.
Job Summary
The AI Operations Specialist 2 is responsible for designing and implementing productionized Artificial Intelligence (AI) solutions to solve business problems. This role works closely with our data science teams and other stakeholders to enable the integration of AI/ML models into business processes. These solutions need to be scalable, resilient, and secure. This role also maintains CI/CD pipelines, productionizes models (ML/DL/LLM), and develops the integration code needed to deploy AI solutions.
Essential Job Functions
- Design and implement productionized model (ML/DL/LLM) solutions that are scalable, resilient, and secure.
- Evaluate and optimize data science methodology needs while meeting the non-functional requirements of the business process.
- Monitor and maintain the performance of deployed models.
- Monitor production performance and provide recommendations for maximizing ML/LLM configurations and performance.
- Support tool and platform administrators in maintaining the health and functionality of the ecosystem.
- Create and manage release pipelines for data science teams that facilitate Continuous Integration (CI) and Continuous Deployment (CD).
- Automate build and deployment procedures to streamline delivery.
- Schedule and validate all production deployments.
- Partner with Release Management, Infrastructure, DevOps, etc. to ensure a smooth and successful deployment.
- Collaborate with different teams to implement models and monitor outcomes.
- Provide updates to stakeholders on the status of deployments and any risks and issues.
- Keep up to date with the latest technology trends.
- Continuously improve models and techniques to adapt to new data patterns and trends.
Minimum Qualifications
- Bachelor's Degree in Computer Science, Engineering, or a related field of study.
- 2+ years of experience in AI Operations, Machine Learning Engineering, Data Science, Data Engineering, DevOps, Analytics, or related fields, with a focus on retail financial services or information technology, utilizing agile methodologies.
Preferred Qualifications
- 5+ years of experience in AI Operations, Machine Learning Engineering, Data Science, Data Engineering, DevOps, Analytics, or related fields.
Skills
DevOps, ServiceNow Platform, Azure DevOps, MLflow, Apache Airflow, Application Programming Interface (API), Docker (Software), GitHub, Data Analysis, Microsoft Excel, Agile Environments, JIRA Tool
Reports To: Manager and above
Direct Reports: 0
Work Environment
Normal office environment (Hybrid); 6 to 8 days per month are required in the office.
Travel
Ability to travel up to 5% annually.
Other Duties
This job description is illustrative of the types of duties typically performed by this job. It is not intended to be an exhaustive listing of each and every essential function of the job. Because job content may change from time to time, the Company reserves the right to add and/or delete essential functions from this job at any time.
About Bread Financial
At Bread Financial, you'll have the opportunity to grow your career, give back to your community, and be part of our award-winning culture. We've been consistently recognized as a best place to work nationally and in many markets, and we're proud to promote an environment where you feel appreciated, accepted, valued, and fulfilled both personally and professionally. Bread Financial supports the overall wellness of our associates with a diverse suite of benefits and offers boundless opportunities for career development and non-traditional career progression. Bread Financial (NYSE: BFH) is a tech-forward financial services company that provides simple, personalized payment, lending, and saving solutions to millions of U.S. consumers.
Our payment solutions, including Bread Financial general purpose credit cards and savings products, empower our customers and their passions for a better life. Additionally, we deliver growth for some of the most recognized brands in travel & entertainment, health & beauty, jewelry and specialty apparel through our private label and co-brand credit cards and pay-over-time products, providing choice and value to our shared customers. To learn more about Bread Financial, our global associates and our sustainability commitments, visit breadfinancial.com or follow us on Instagram and LinkedIn. All job offers are contingent upon successful completion of credit and background checks. Bread Financial is an Equal Opportunity Employer.
Job Family: Data and Analytics
Job Type: Regular
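The "monitor and maintain the performance of deployed models" duty in this listing can be sketched as a threshold check over a rolling window of a quality metric. The metric, floor, and window size below are illustrative assumptions; a production setup would feed this from real scoring jobs and wire the result into alerting.

```python
from collections import deque

class ModelMonitor:
    """Track a rolling window of a model quality metric (e.g. daily
    accuracy) and flag when the window average drops below an agreed floor.
    The 0.9 floor and 7-day window are example values, not a standard."""

    def __init__(self, floor=0.9, window=7):
        self.floor = floor                  # minimum acceptable average
        self.scores = deque(maxlen=window)  # oldest scores fall off

    def record(self, score):
        """Append one observation (e.g. today's accuracy)."""
        self.scores.append(score)

    def healthy(self):
        """True while the rolling average is at or above the floor."""
        if not self.scores:
            return True  # no data yet: nothing to alarm on
        return sum(self.scores) / len(self.scores) >= self.floor
```

Averaging over a window rather than alerting on a single bad day is a deliberate trade-off: it suppresses noise at the cost of slower detection of sudden degradation.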
Posted 1 week ago
7.0 - 12.0 years
37 - 45 Lacs
Hyderabad
Work from Office
Position Summary
The resource is responsible for assisting MetLife's Docker container support of application development teams. In this position the resource will support MetLife applications in an operational role, onboarding applications and troubleshooting infrastructure and application container issues, and will automate manual build processes using CI/CD pipelines.
Job Responsibilities
- Development and maintenance in operational condition of OpenShift and Kubernetes orchestration container platforms.
- Experience in workload migration from Docker to the OpenShift platform.
- Manage the container platform ecosystem (installation, upgrade, patching, monitoring).
- Check and apply critical patches in OpenShift/Kubernetes.
- Troubleshoot issues in OpenShift clusters.
- Experience in OpenShift implementation, administration and support.
- Working experience in OpenShift and Docker/K8s.
- Knowledge of CI/CD methodology and tooling (Jenkins, Harness).
- Experience with system configuration tools including Ansible and Chef.
- Cluster maintenance and administration experience on OpenShift and Kubernetes.
- Strong knowledge of and experience in RHEL Linux.
- Manage OpenShift management components and tenants.
- Participate as part of a technical team responsible for the overall support and management of the OpenShift Container Platform.
- Learn new technologies based on demand.
- Willing to work in rotational shifts.
- Good communication skills, with the ability to communicate clearly and effectively.
Knowledge, Skills and Abilities
Education
- Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience
- 7+ years of total experience, with at least 4+ years of experience in development and maintenance in operational condition of OpenShift and Kubernetes orchestration container platforms.
- Experience in installation, upgrade, patching and monitoring of the container platform ecosystem.
- Experience in workload migration from Docker to the OpenShift platform.
- Good knowledge of CI/CD methodology and tooling (Jenkins, Harness).
- Linux administration.
- Software-defined networking (fundamentals).
- Container runtimes (Podman/Docker), Kubernetes (OpenShift)/Swarm orchestration, the GoLang framework and microservices architecture.
- Knowledge and usage of observability tools (e.g. Elastic, Grafana, Prometheus, OTEL collectors, Splunk).
- Apache administration.
- Automation platforms: specifically Ansible (roles/collections).
- SAFe DevOps Scaled Agile methodology.
- Scripting: Python, Bash.
- Serialization languages: YAML, JSON.
- Knowledge and usage of CI/CD tools (e.g. AzDO, ArgoCD).
- Reliability management/troubleshooting.
- Collaboration & communication skills.
- Continuous Integration/Continuous Delivery (CI/CD).
- Experience in creating change tickets and working on tasks in ServiceNow.
- Java management (JMX)/NodeJS management.
Other Requirements (licenses, certifications, specialized training if required)
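The serialization and platform skills listed above (YAML/JSON, OpenShift/Kubernetes) often meet in one task: validating a deployment manifest before it is applied to a cluster. A small sketch using the standard-library json module follows; the field names mirror the common Kubernetes Deployment shape, but the checks are illustrative, not an official schema.

```python
import json

# Top-level fields every Kubernetes-style object carries.
REQUIRED_TOP_LEVEL = ("apiVersion", "kind", "metadata", "spec")

def validate_manifest(text):
    """Parse a JSON manifest and report problems. Returns (ok, problems)."""
    doc = json.loads(text)
    problems = [f"missing field: {k}" for k in REQUIRED_TOP_LEVEL
                if k not in doc]
    # Illustrative extra check: a Deployment should request >= 1 replica.
    if doc.get("kind") == "Deployment":
        replicas = doc.get("spec", {}).get("replicas", 0)
        if replicas < 1:
            problems.append("spec.replicas must be >= 1")
    return (not problems, problems)

# A minimal well-formed example manifest (names are invented).
manifest = json.dumps({
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "demo"},
    "spec": {"replicas": 2},
})
```

Real pipelines would run checks like this in CI (or use a schema validator) so that malformed manifests fail fast, before `oc apply` or `kubectl apply` ever touches the cluster.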
Posted 1 week ago
5.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Senior Site Reliability Engineer - JD
As a Senior Site Reliability Engineer (SRE), you will collaborate closely with our Development and IT teams to ensure the reliability, scalability, and performance of our applications. You will take ownership of setting and maintaining service-level objectives (SLOs), building robust monitoring and alerting, and continually improving our infrastructure and processes to maximize uptime and deliver an exceptional customer experience. This role operates at the intersection of development and operations, reinforcing best practices, automating solutions, and reducing toil across systems and platforms.
About QualMinds:
QualMinds is a global technology company dedicated to empowering clients on their digital transformation journey. We help our clients design and develop world-class digital products, custom software and platforms. Our primary focus is delivering enterprise-grade interactive software applications across web, desktop, mobile, and embedded platforms.
Responsibilities:
1. Ensure Reliability & Performance: Own the observability of our systems, ensuring they meet established service-level objectives (SLOs) and maintain high availability.
2. Cloud & Container Orchestration: Deploy, configure, and manage resources on Google Cloud Platform (GCP) and Google Kubernetes Engine (GKE), focusing on secure and scalable infrastructures.
3. Infrastructure Automation & Tooling: Set up and maintain automated build and deployment pipelines; drive continuous improvements to reduce manual work and risks.
4. Monitoring & Alerting: Develop and refine comprehensive monitoring solutions (performance, uptime, error rates, etc.) to detect issues early and minimize downtime.
5. Incident Management & Troubleshooting: Participate in on-call rotations; manage incidents through resolution, investigate root causes, and create blameless postmortems to prevent recurrences.
6.
Collaboration with Development: Partner with development teams to design and release services that are production-ready from day one, emphasizing reliability, scalability, and performance.
7. Security & Compliance: Integrate security best practices into system design and operations; maintain compliance with SOC 2 and other relevant standards.
8. Performance & Capacity Planning: Continuously assess system performance and capacity; propose and implement improvements to meet current and future demands.
9. Technical Evangelism: Contribute to cultivating a culture of reliability through training, documentation, and mentorship across the organization.
Requirements:
- Bachelor's degree in Computer Science, Business Administration, or relevant work experience.
- A minimum of 5+ years in an SRE, DevOps, or similar role in an IT environment, required.
- Hands-on experience with Microsoft SQL clusters, Elasticsearch, and Kubernetes, required.
- Deep familiarity with Windows or Linux environments and .NET or PHP stack applications, including IIS/Apache, SQL Server/MySQL, etc.
- Strong understanding of networking, firewalls, intrusion detection, and security best practices.
- Proven administrative experience with tools like Git, TFS, Bitbucket, and Bamboo for Continuous Integration, Delivery, and Deployment.
- Knowledge of automation testing tools such as SonarQube, Selenium, or comparable technologies.
- Experience with performance profiling, logging, metrics collection, and alerting tools.
- Competence in debugging solutions across diverse environments.
- Hands-on experience with GCP, AWS, or Azure, container orchestration (Kubernetes), and microservices-based architectures.
- Understanding of authentication, authorization, OAuth, SAML, encryption (public/private key, symmetric, asymmetric), token validation, and SSO.
- Familiarity with security strategies to optimize performance while maintaining compliance (e.g., SOC 2).
- Willingness to participate in an on-call rotation and respond to system emergencies 24/7 when necessary.
- Monthly weekend rotation for production patching.
- A+, MCP, and Dell certifications and Microsoft Office expertise are a plus!
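The SLO ownership this role describes rests on simple arithmetic: an availability target implies an error budget of allowed downtime per period. A sketch of that calculation (the 99.9% target and 30-day window are example values, not a universal standard):

```python
def error_budget_minutes(slo=0.999, period_days=30):
    """Allowed downtime, in minutes, for an availability SLO over a period.
    e.g. 99.9% over 30 days -> 43,200 min * 0.001 = 43.2 minutes."""
    total_minutes = period_days * 24 * 60
    return total_minutes * (1 - slo)

def budget_remaining(slo, period_days, downtime_minutes):
    """Fraction of the error budget still unspent (negative if overspent)."""
    budget = error_budget_minutes(slo, period_days)
    return (budget - downtime_minutes) / budget
```

Teams commonly use the remaining fraction as a release gate: with budget in hand, ship features; once the budget is spent, shift effort to reliability work until the window rolls over.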
Posted 1 week ago
7.0 - 12.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Job Title: WebLogic Technical Lead
Location: Bangalore
Department: IT Infrastructure / Middleware Services
Job Summary:
We are looking for an experienced WebLogic Technical Lead to oversee the design, implementation, optimization, and support of Oracle WebLogic environments in a high-availability enterprise setup. The ideal candidate will lead a team of middleware engineers, provide technical direction, and ensure the reliability, security, and scalability of application server platforms.
Key Responsibilities:
- Lead the design, deployment, configuration, tuning, and support of Oracle WebLogic Server (11g, 12c, 14c) environments across development, testing, and production.
- Manage WebLogic domain architecture, including admin servers, managed servers, node managers, and cluster configurations.
- Oversee application deployments (EAR, WAR, JAR) and support Java EE applications in standalone and clustered environments.
- Drive JVM tuning, memory leak analysis, thread dump analysis, and performance tuning of WebLogic instances.
- Manage WebLogic patching, version upgrades, and JDK migrations with zero/minimal downtime.
- Integrate WebLogic with Oracle HTTP Server (OHS), Apache, load balancers, and SSL certificates.
- Collaborate with application, security, and network teams on system hardening, SSL/TLS configurations, and vulnerability remediation.
- Lead incident resolution and root cause analysis for critical issues affecting middleware availability and performance.
- Automate routine tasks using shell scripting, WLST (WebLogic Scripting Tool), or Ansible (preferred).
- Support disaster recovery (DR) setup and participate in DR drills for middleware environments.
- Guide and mentor junior team members and enforce best practices for configuration management, change control, and documentation.
- Contribute to architectural decisions involving middleware modernization and potential migration to cloud-native platforms.
Required Qualifications & Skills:
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 7+ years of experience in middleware administration, with at least 3+ years in a WebLogic lead or senior role.
- Strong hands-on experience with: Oracle WebLogic Server 11g/12c/14c; WebLogic clustering, high availability, and load balancing; JMS, JDBC data sources, connection pooling, and tuning; SSL, keystores, truststores, and certificate management; shell scripting, WLST scripting, and log analysis.
- Deep understanding of Java EE architecture, deployment models, and the application lifecycle on WebLogic.
- Proficient in incident and change management, preferably under ITIL frameworks.
- Experience supporting WebLogic in RAC-integrated environments.
- Familiarity with monitoring tools (OEM, Prometheus/Grafana, AppDynamics, etc.).
Preferred Qualifications:
- Exposure to Kubernetes, Docker, or cloud-based middleware services (OCI, AWS, Azure).
- Knowledge of DevOps tools (Git, Jenkins, Ansible) and CI/CD for middleware.
- Oracle certifications (OCA/OCP) in WebLogic are a plus.
Soft Skills & Attributes:
- Strong leadership, team coordination, and stakeholder communication skills.
- Ability to prioritize and manage multiple initiatives under tight deadlines.
- Analytical mindset with a strong focus on proactive monitoring and resilient design.
- Willingness to provide 24x7 support for critical production issues and on-call rotation.
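The thread dump analysis task this posting mentions can be illustrated with a short script. This is a hedged sketch only, not part of the job description: it assumes jstack-style dump text containing `java.lang.Thread.State:` markers, and the sample dump is invented for illustration.

```python
from collections import Counter
import re

def thread_state_summary(dump: str) -> Counter:
    """Count JVM thread states in a jstack-style thread dump.

    Real dumps put the State line below each thread header; for triage
    purposes only the State markers matter, so we scan for those.
    """
    states = re.findall(r"java\.lang\.Thread\.State: (\w+)", dump)
    return Counter(states)

# Invented sample dump for illustration.
sample = """
"ExecuteThread-1" #12 daemon prio=5
   java.lang.Thread.State: BLOCKED
"ExecuteThread-2" #13 daemon prio=5
   java.lang.Thread.State: RUNNABLE
"ExecuteThread-3" #14 daemon prio=5
   java.lang.Thread.State: BLOCKED
"""

summary = thread_state_summary(sample)
# A spike in BLOCKED execute threads is a classic sign of lock
# contention worth escalating during incident triage.
print(summary)
```

In practice the dump text would come from `jstack <pid>` or a WebLogic server thread dump rather than an inline string.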
Posted 1 week ago
3.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Description
Company Description: Genisup India Private Limited is a semiconductor and system design company based in Bangalore. We specialize in Foundational IP Fabless Design, Semiconductor & Product Engineering, and IoT solutions. This position is for one of our global semiconductor clients, focused on developing cutting-edge EDA infrastructure tools.
Role Description: Genisup is seeking a talented UI/UX Consultant to join our Bangalore-based CAD team. The selected candidate will be responsible for developing and maintaining web-based applications that support EDA tool license forecasting, utilization tracking, and other CAD infrastructure utilities.
Responsibilities:
- Design and maintain web applications for the CAD team supporting VLSI design workflows.
- Implement intuitive and responsive UI/UX using HTML/CSS, JavaScript, Django, Apache, and SQL.
- Develop and enhance the FAT (Forecast and Tracking) web application for EDA license tracking and infrastructure reporting (e.g., LSF job statistics, server farm utilization).
- Manage bug tracking and feature enhancements for FAT and other GUI-based tools.
Qualifications
Experience: 3 to 6 years in web application development.
Technical Skills:
- Strong expertise in Python/Django development.
- Experience with PostgreSQL/MySQL database setup and management.
- Proficiency in HTML/CSS and Apache server configuration.
- Familiarity with Linux systems and EDA license management (FLEXlm) is a plus.
- Understanding of CAD/IT infrastructure such as IBM Platform LSF and Altair RTDA is advantageous.
- Experience with version control tools like Perforce is a bonus.
- Exposure to AI/ML, data analytics, and automation using Perl, Python, and Shell scripting is a strong advantage.
Academics: Bachelor's or Master's degree (BE/B.Tech/ME/M.Tech) in Computer Science (CS), Electronics & Communication Engineering (ECE), or related fields.
Posted 1 week ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Apache Spark and data warehousing solutions. - Strong understanding of data modeling and database design principles. - Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. - Experience in programming languages such as Python or Scala for data processing. 
Additional Information: - The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full time education is required.
Posted 1 week ago
5.0 - 7.0 years
3 - 7 Lacs
Ahmedabad
Work from Office
- Lead and mentor a team of Linux system administrators, assigning tasks and monitoring performance.
- Design, deploy, maintain, and optimize Linux-based infrastructure (RedHat, CentOS, Oracle Linux, Ubuntu).
- Manage critical services such as Apache, Nginx, MySQL/MariaDB, etc.
- Configure and maintain monitoring tools (e.g., Nagios, Zabbix, Prometheus, Grafana).
- Implement and enforce security practices: patching, hardening, firewalls (iptables/nftables), SELinux.
- Oversee backup and disaster recovery processes.
- Plan and execute migrations, upgrades, and performance tuning.
- Collaborate with cross-functional teams (Network, DevOps, Development) to support infrastructure needs.
- Define and document policies, procedures, and best practices.
- Respond to incidents and lead root cause analysis for system outages or degradations.
- Maintain uptime and SLAs for production environments.
- Experience with virtualization (KVM, VMware, Proxmox) and cloud platforms (AWS, GCP, Azure, or private cloud).
- Solid understanding of TCP/IP, DNS, DHCP, VPN, and other network services.
- Hands-on experience working with firewalls such as Sophos and FortiGate.
- Strong problem-solving and incident management skills.
Posted 1 week ago
3.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Apache Spark Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. Professional & Technical Skills: - Must To Have Skills: Proficiency in Apache Spark. - Strong understanding of data pipeline architecture and design. - Experience with ETL processes and data integration techniques. - Familiarity with data warehousing concepts and technologies. - Knowledge of data quality frameworks and best practices. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Apache Spark. - This position is based in Chennai. - A 15 years full time education is required.
Posted 1 week ago
6.0 - 10.0 years
14 - 15 Lacs
Ahmedabad
Work from Office
Jobs At CCIL - The Clearing Corporation of India Limited
Job Description: Techno Functional Project Manager - Gandhinagar GIFT City
Job Title: Deputy Manager II
Department: Information Technology
Reports To: Sr. Manager
Experience: 6 - 10 Years
Preferred Qualification: B.E./B.Tech
Required Qualification: B.E./B.Tech/BSc (with PG Dip Computers)/BSc (IT)/MCA
Skill, Knowledge & Trainings:
- At least 3 years of experience in managing trading or settlement applications with solution delivery through all phases of the project.
- Basic understanding of project management principles and methodologies.
- Previous experience in trading/payment systems analysis, design, and development on front-end technology, namely Java, Apache, PostgreSQL, and Linux OS based systems.
- Familiarity with payment systems and trading technology development, internet and consumer trends.
- Willing to relocate to GIFT City, Gandhinagar, if required.
Skills:
- Experience working with web development/integration projects built on the Java stack.
- Ability to effectively manage multiple projects across various workstreams.
- Serve as the primary point of contact for business stakeholders and third-party vendors.
- Skilled in collaborating with designers, developers, and product teams to achieve project goals.
- Demonstrates the ability to work independently, with strong problem-solving and analytical skills.
- Self-motivated with excellent interpersonal skills, a positive attitude, and a proactive, go-getter mindset.
- Knowledge of information technology infrastructure and network architecture.
- Knowledge of trading systems and payment systems.
- Fundamental understanding of application security and secure coding practices.
- Knowledge of the ISO 20022 standard.
- Carry out periodic inspections of the project along with all stakeholders, including business and third-party vendors.
- Be the single point of contact for the business and third-party vendor.
- Build and maintain relationships with business and third-party vendor team members.
- Responsible for project goals, deliverables, schedule, budget, and resources.
- Manage and support payment/settlement development projects through all phases, including design, planning, build and test, deployment, and transition to maintenance.
Core Competencies: Project Management, Application Development, Application Support
Functional Competencies: Payment Systems, Capital Markets
Job Purpose:
- Participate in requirement gathering, analysis, and freezing of requirements.
- Collaborate with team members across different IT domains (e.g., developers, business team, infrastructure) to achieve project objectives.
- Work closely with business and development vendors to maintain roadmaps and product backlogs and establish priorities.
- Assist in the testing and implementation stages of project cycles.
- Identify dependencies in integration or standalone projects and mitigate the risk.
- Track and report on project milestones and provide status reports to management.
The IT SPOC role is to oversee, execute, and ensure the successful delivery and management of single or multiple projects within scope, quality, time, and cost constraints that may be clearly defined or may require dynamic change management to deliver business value.
Area of Operations: Working on technology projects with cross-functional teams to achieve project milestones within defined timelines and deliver high-quality results.
Key Responsibilities:
1. Meeting with business users and understanding the business requirements.
2. Production support and change implementation for assigned projects.
3. Assisting in planning, coordinating, and managing IT projects from inception to completion under the supervision of a Senior Project Manager.
Any Other Requirement: Excellent communication skills, both verbal and written.
Personal Attributes: The incumbent must be a good team player, with the ability to take responsibility for assigned work and a readiness to learn new technologies.
Posted 1 week ago
3.0 - 8.0 years
10 - 12 Lacs
Ahmedabad
Work from Office
Jobs At CCIL - The Clearing Corporation of India Limited
Job Description: Application & Functional Support - Gandhinagar GIFT City
Job Title: Assistant Manager II / Deputy Manager I
Department: Information Technology
Reports To: Sr. Manager
Experience: 3-8 Years
Preferred Qualification: BE/B.Tech
Required Qualification: B.E./B.Tech/BSc (with PG Dip Computers)/BSc (IT)/MCA
Skill, Knowledge & Trainings:
Mandatory:
- The Application Support Engineer will work in shifts (excluding night shifts).
- Must have hands-on experience in deploying and supporting applications built on the Java stack in a Linux environment.
- Proficient in working with PostgreSQL database technology.
- Experienced in deploying web applications on Apache web servers.
- Strong hands-on experience with the Linux operating system.
- Practical experience with IBM WebSphere MQ (both MQ Server and Client) is required.
Knowledge:
- Strong proficiency in Spring Boot, Spring Framework, Java, REST APIs, and React JS.
- Basic troubleshooting skills for both Windows and Linux operating systems.
- Ability to provide first-level support for LAN and WAN networks.
- Capable of accurately understanding and diagnosing technical issues over the phone and delivering appropriate support.
- A good understanding of ISO 20022 standards and payment systems is a plus.
Core Competencies: As per knowledge and skills given above
Functional Competencies: Banking / KYC
Job Purpose: To provide application support and maintenance for Java-based applications, along with offering member support via phone and email.
Area of Operations: Application support and maintenance; provide both technical and functional support to member banks via telephone and email.
Key Responsibility: Application Deployment & Maintenance: Responsible for the deployment and ongoing maintenance of assigned projects.
Production Support: Ensure production support for payment systems by performing the following tasks:
- Beginning of Day (BOD) activities
- Intra-day application monitoring
- End of Day (EOD) activities
- Patch upgrades
- Web server installation
- Troubleshooting and analysis
- Maintaining production Standard Operating Procedures (SOPs) and deploying production releases
- Investigating user queries through database queries (SQL), application log file analysis, order and trade issues, and application flow interruptions
Team Liaison: Collaborate with other teams, including Network, Infrastructure, and System Administration.
Member Bank Support: Provide both technical and functional support to member banks via telephone and email.
User Coordination: Coordinate with users to resolve issues and ensure smooth operations.
Any Other Requirement: Working Conditions: Should be ready to work under pressure and in shifts (no night shift).
Personal Attributes: The incumbent must be a good team player, with the ability to take responsibility for assigned work and a readiness to learn new technologies.
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. Cloud Engineer for our public cloud operations team will focus on automating governance policies in our cloud environments. The goal is to enable self-service wherever possible without compromising security. The team is responsible for partnering with multiple stakeholders in framing and implementing governance policy frameworks for Cloud platforms primarily on AWS and also on OCI and GCP. What You’ll Be Doing... Governance team is primarily responsible for ensuring that the data and processes that are used in public cloud platforms are secured and controlled so that application workloads in those cloud platforms are not exposed to unintended users or services. Governance includes implementation of strict policies for managing users, roles, permissions and accounts, and ensuring enforcement and compliance of those policies, visibility into who is doing what and auditing what changes were made to the environment. Another aspect of Governance is periodic audit on resource utilization and terminating services that are under-utilized or non-compliant to organizational standards Designing and automate governance framework across our cloud environments with an emphasis on AWS. Automating & Maintaining Cloud Governance WebPortal to allow application and infra teams to generate reports and raise exception requests. 
Monitoring, logging, audits, and automated policy enforcement for security and cost compliance. Ensuring service availability and continuity through proper response to incidents and requests. What We’re Looking For... You’ll need to have: Bachelor’s degree or one or more years of work experience. Core experience and knowledge of Python and Django frameworks, HTML, Angular, and scripting languages. Good experience with Apache web server on Linux OS. One or more years of experience in building cloud platform architecture solutions on public and/or private cloud platforms with an emphasis on governance/security tools. Hands-on knowledge of core AWS services like EC2, S3, EBS, ELB, AWS Lambda, CLI, etc., and familiarity with AWS network services. Even Better To Have: Master's degree. Cloud certification. Experience in infrastructure and cloud services with proficiency in automation using Python, ReactJS, Unix shell, and other scripting languages. Experience with modern source control repositories (e.g., Git) and DevOps toolsets (Jenkins, Ansible, etc.) and familiarity with Agile/Scrum methodologies. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours: 40. Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
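The resource-utilization audit described in this role can be sketched in plain Python. This is an illustrative sketch only: the record shape, field names, and thresholds below are invented assumptions, not Verizon's actual tooling (real data would come from a metrics service such as CloudWatch).

```python
# Hypothetical utilization records, e.g. aggregated from monitoring metrics.
# Field names and the 10% CPU threshold are illustrative assumptions.
instances = [
    {"id": "i-0a1", "avg_cpu_pct": 3.2, "days_observed": 30},
    {"id": "i-0b2", "avg_cpu_pct": 41.7, "days_observed": 30},
    {"id": "i-0c3", "avg_cpu_pct": 1.1, "days_observed": 7},
]

def flag_underutilized(records, cpu_threshold=10.0, min_days=14):
    """Return IDs observed long enough yet averaging below the CPU threshold.

    Instances with too short an observation window are skipped, so new
    workloads are not flagged prematurely.
    """
    return [
        r["id"]
        for r in records
        if r["days_observed"] >= min_days and r["avg_cpu_pct"] < cpu_threshold
    ]

flagged = flag_underutilized(instances)
print(flagged)  # candidates for an exception request or termination review
```

In a governance portal like the one described, flagged IDs would feed a report or an exception-request workflow rather than being terminated automatically.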
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Location HYDERABAD OFFICE INDIA Job Description We’re looking for a Platform Engineer to join our Data & Analytics team. We are searching for self-motivated candidates who will play a vital role in enabling self-serve usage of Databricks Unity Catalog features at P&G at a scale of 200+ applications. Responsibilities: Conducting analysis and experiments within the Databricks ecosystem. Implementing and maintaining data governance best practices, focusing on data security and access controls. Collaborating with business and semi-technical collaborators to understand requirements and develop solutions using Azure and Databricks. Working closely with Data Engineers to understand their technical needs related to Unity Catalog and propose effective solutions for data processing. Keeping up with the latest advancements in Databricks and data engineering technologies and testing new Databricks features. Participating in data architecture and design discussions, sharing insights and recommendations. Testing architect-defined patterns and providing feedback based on implementation experiences. Managing Databricks Unity Catalog objects using Terraform. Building solutions in Terraform and Java (APIs) to deploy Delta Sharing and Lakehouse Federation at scale. Developing scalable solutions, guidelines, principles, and standard methodologies for multiple clients. Job Qualifications At least 3 years of hands-on experience working with Databricks. Experience implementing projects and solutions in the cloud (Azure preferred). Bachelor's degree or equivalent experience in Computer Science, Data Engineering, or a related field. Experience in Data Engineering: data ingestion, modeling, and pipeline development. Familiarity with data engineering best practices, including query optimization. Proficiency in SQL and Python. Experience with Terraform. Familiarity with Big Data/ETL processes (Apache Spark). Knowledge of crafting and implementing REST APIs. Experience with CI/CD practices and Git.
Strong teamwork and interpersonal skills. About Us We produce globally recognized brands and we grow the best business leaders in the industry. With a portfolio of trusted brands as diverse as ours, it is paramount our leaders are able to lead with courage the vast array of brands, categories and functions. We serve consumers around the world with one of the strongest portfolios of trusted, quality, leadership brands, including Always®, Ariel®, Gillette®, Head & Shoulders®, Herbal Essences®, Oral-B®, Pampers®, Pantene®, Tampax® and more. Our community includes operations in approximately 70 countries worldwide. Visit http://www.pg.com to know more. We are an equal opportunity employer and value diversity at our company. We do not discriminate against individuals on the basis of race, color, gender, age, national origin, religion, sexual orientation, gender identity or expression, marital status, citizenship, disability, HIV/AIDS status, or any other legally protected factor. “At P&G, the hiring journey is personalized every step of the way, thereby ensuring equal opportunities for all, with a strong foundation of Ethics & Corporate Responsibility guiding everything we do. All the available job opportunities are posted either on our website - pgcareers.com, or on our official social media pages, for the convenience of prospective candidates, and do not require them to pay any kind of fees towards their application.” Job Schedule Full time Job Number R000134919 Job Segmentation Experienced Professionals
Posted 1 week ago
1.0 - 2.0 years
1 - 5 Lacs
Hyderabad
Work from Office
Develop and maintain web applications using PHP and related technologies. Work closely with front-end developers and design teams to integrate user-facing elements with server-side logic. Troubleshoot, test, and maintain the core product software to ensure strong optimization and functionality. Identify and resolve bottlenecks, fix bugs, and improve application performance. Write well-designed, testable, and efficient code by adhering to best practices. Requirements: Proficiency in PHP: Experience with PHP frameworks like Laravel, Symfony, or CodeIgniter. Knowledge of front-end technologies: HTML5, CSS3, JavaScript, jQuery. Experience with databases: MySQL, PostgreSQL, or MariaDB. Familiarity with version control systems: Git, GitHub, GitLab. Understanding of RESTful APIs and JSON: Experience with third-party API integration. Experience with LAMP stack: Linux, Apache, MySQL, PHP. Familiarity with content management systems: WordPress, Drupal is a plus. Knowledge of cloud platforms: AWS, Google Cloud. Problem-solving skills: Strong debugging and troubleshooting abilities. Understanding of security best practices.
Posted 1 week ago
4.0 - 9.0 years
6 - 10 Lacs
Gurugram
Work from Office
About Mindera: At Mindera, we craft software with people we love. We're a collaborative, global team of engineers who value open communication, great code, and building impactful products. We're looking for a talented C#/.NET Developer to join our growing team in Gurugram and help us build scalable, high-quality software systems. What You'll Do: Build, maintain, and scale robust C#/.NET applications in a fast-paced Agile environment. Work closely with product owners and designers to bring features to life. Wr
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Requirements: Possess good experience developing Java EE web applications and Java UI frameworks. Certifications in Java, Java EE, etc., are preferred. Some project leading experience in IT will be an added advantage. 3-5 years of experience in systems analysis, design, and programming with knowledge of one or more of the following technologies: Java, Java EE, Struts, JSF, Spring, EJB, Hibernate, iBatis, WebLogic Application Server, Elixir, Oracle Database, LDAP, MQ Series, JBoss Application Server, Apache Tomcat, ESB, SQL, PL/SQL. This job was posted by Kanika B from Applore Technologies.
Posted 1 week ago
2.0 - 7.0 years
6 - 7 Lacs
Bengaluru
Work from Office
Draup - Multi-Dimensional Global Labor & Market Data
Data Analyst - Tech
Job Summary: We are looking for a highly skilled Big Data & ETL Tester to join our data engineering and analytics team. The ideal candidate will have strong experience in PySpark, SQL, and Python, with a deep understanding of ETL pipelines, data validation, and cloud-based testing on AWS. Familiarity with data visualization tools like Apache Superset or Power BI is a strong plus. You will work closely with our data engineering team to ensure data availability, consistency, and quality across complex data pipelines, and help transform business requirements into robust data testing frameworks.
Key Responsibilities:
- Collaborate with big data engineers to validate data pipelines and ensure data integrity across ingestion, processing, and transformation stages.
- Write complex PySpark and SQL queries to test and validate large-scale datasets.
- Perform ETL testing, covering schema validation, data completeness, accuracy, transformation logic, and performance testing.
- Conduct root cause analysis of data issues using structured debugging approaches.
- Build automated test scripts in Python for regression, smoke, and end-to-end data testing.
- Analyze large datasets to track KPIs and performance metrics supporting business operations and strategic decisions.
- Work with data analysts and business teams to translate business needs into testable data validation frameworks.
- Communicate testing results, insights, and data gaps via reports or dashboards (Superset/Power BI preferred).
- Identify and document areas of improvement in data processes and advocate for automation opportunities.
- Maintain detailed documentation of test plans, test cases, results, and associated dashboards.
Required Skills and Qualifications:
- 2+ years of experience in big data testing and ETL testing.
- Strong hands-on skills in PySpark, SQL, and Python.
- Solid experience working with cloud platforms, especially AWS (S3, EMR, Glue, Lambda, Athena, etc.).
- Familiarity with data warehouse and lakehouse architectures.
- Working knowledge of Apache Superset, Power BI, or similar visualization tools.
- Ability to analyze large, complex datasets and provide actionable insights.
- Strong understanding of data modeling concepts, data governance, and quality frameworks.
- Experience with automation frameworks and CI/CD for data validation is a plus.
Preferred Qualifications:
- Experience with Airflow, dbt, or other data orchestration tools.
- Familiarity with data cataloging tools (e.g., AWS Glue Data Catalog).
- Prior experience in a product or SaaS-based company with high data volume environments.
Why Join Us?
- Opportunity to work with a cutting-edge data stack in a fast-paced environment.
- Collaborate with passionate data professionals driving real business impact.
- Flexible work environment with a focus on learning and innovation.
Draup is a member of the Ethical AI Governance Group (EAIGG). As an AI-first company, Draup has been a champion of ethical and responsible AI since day one. Our models adhere to the strictest data standards and are routinely audited for bias.
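Row-count and aggregate reconciliation between source and target tables is the bread and butter of the ETL testing this posting describes. The sketch below is illustrative only: it uses SQLite in place of a real warehouse, and the table names and data are invented for the example.

```python
import sqlite3

# Stand-ins for a real source system and warehouse target;
# the schema and rows here are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and summed amounts between two tables."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_count, src_sum = conn.execute(q.format(src)).fetchone()
    tgt_count, tgt_sum = conn.execute(q.format(tgt)).fetchone()
    return {
        "count_match": src_count == tgt_count,
        # Float tolerance matters when sums are computed on different engines.
        "sum_match": abs(src_sum - tgt_sum) < 1e-9,
    }

result = reconcile(conn, "src_orders", "tgt_orders")
print(result)
```

The same count-and-checksum pattern carries over to PySpark or Athena queries against real pipelines; only the query execution layer changes.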
Posted 1 week ago
5.0 - 10.0 years
6 - 10 Lacs
Jaipur
Work from Office
Roles & Responsibilities:
- Collaborate with cross-functional agile project teams to ensure quality is driven from requirements definition through to delivery by identifying, documenting, and prioritizing software issues.
- Design and execute test cases and plans to verify software product functionality and performance.
- Increase the effectiveness and accuracy of testing procedures and create automated test scripts.
- Understand non-functional requirements (performance and load, response time, SLAs for application/system performance and availability).
- Work closely with the Business Analysis team to review non-functional requirements and provide necessary feedback.
- Design load models and performance test scenarios per the test strategy to test performance requirements. Identify test data needs.
- Build performance test scripts and scenarios using a performance testing tool such as HP LoadRunner, NeoLoad, JMeter, or WebLOAD. Identify measurement points and monitors.
- Create performance test reports covering current performance levels, bottlenecks, and recommendations for performance optimization and improvement.
- Be a creative thinker who can quickly identify and test for functional edge cases outside of the expected functionality workflow.
- Have the drive to become an expert in: unit testing, UX testing, UI testing, integration testing of APIs, performance and scalability testing, and security penetration testing.
Experience: 5+ years of experience
Qualifications: Bachelor's degree, preferably B.E/B.Tech
Technical Skills:
- Strong background in, and at least 3+ years of experience working in, QA automation.
- Experience in programming languages like Java, Python, or C++.
- Experience in writing, executing, and monitoring automated test suites using a variety of technologies including, but not limited to, Cucumber, Concordion, Selenium, Fit/FitNesse, and SoapUI.
- At least 2 years of experience in any of the load and performance tools (Micro Focus LoadRunner, Apache JMeter, NeoLoad, Gatling) for scripting, execution, and analysis is a must.
- Proficient working with relational databases such as MySQL and PostgreSQL.
Skills and Knowledge: QA Automation, Load & Performance Tools, SQL, Security Testing, Java, Python or C++
Posted 1 week ago
5.0 - 10.0 years
11 - 13 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building and optimizing data pipelines and data architecture, as well as experience with Azure cloud services. You will work closely with cross-functional teams to ensure data is accessible, reliable, and ready for analytics and business insights.

Mandatory Skills
- Advanced SQL, Python, and PySpark for data engineering
- Azure first-party services (ADF, Azure Databricks, Synapse, etc.)
- Data warehousing (Redshift, Snowflake, BigQuery)
- Workflow orchestration tools (Airflow, Prefect, or similar)
- Experience with DBT (Data Build Tool) for transforming data in the warehouse
- Hands-on experience with real-time data processing frameworks such as Apache Kafka, Apache Flink, or Azure Event Hubs

Key Responsibilities
- Design, develop, and maintain scalable and reliable data pipelines
- Demonstrate experience and leadership across two full project cycles using Azure Data Factory, Azure Databricks, and PySpark
- Collaborate with data analysts, scientists, and software engineers to understand data needs
- Design and build scalable data pipelines using batch and real-time streaming architectures
- Implement DBT models to transform, test, and document data pipelines
- Implement data quality checks and monitoring systems
- Optimize data delivery and processing across a wide range of sources and formats
- Ensure security and governance policies are followed in all data handling processes
- Evaluate and recommend tools and technologies to improve data engineering capabilities
- Lead and mentor junior data engineers as needed
- Work with cross-functional teams in a dynamic and fast-paced environment

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Databricks Professional certification preferred

Technical Skills
- Programming: Python, PySpark, SQL
- ETL and orchestration: Apache Airflow, Prefect, Luigi, DBT
- Real-time streaming: Kafka, Flink, Spark Streaming, Azure Event Hubs
- Data warehousing: Snowflake, BigQuery, Redshift
- Cloud: Azure (ADF, Azure Databricks)
- Databases: PostgreSQL, MySQL, NoSQL (MongoDB, Cassandra)
- Tools: Git, Docker, Kubernetes (basic), CI/CD

Soft Skills
- Strong problem-solving and analytical thinking
- Excellent verbal and written communication
- Ability to manage multiple tasks and deadlines
- Collaborative mindset with a proactive attitude
- Strong analytical skills for working with unstructured datasets

Good to Have
- Experience with real-time data processing (Kafka, Flink)
- Knowledge of data governance and privacy regulations (GDPR, HIPAA)
- Familiarity with ML model data pipeline integration

Work Experience
- Minimum 5 years of relevant experience in data engineering roles
- Experience with Azure first-party services across at least two full project lifecycles

Compensation & Benefits
- Competitive salary and annual performance-based bonuses
- Comprehensive health and optional parental insurance
- Optional retirement savings and tax savings plans

Key Result Areas (KRAs)
- Timely development and delivery of high-quality data pipelines
- Implementation of scalable data architectures
- Collaboration with cross-functional teams on data initiatives
- Compliance with data security and governance standards

Key Performance Indicators (KPIs)
- Uptime and performance of data pipelines
- Reduction in data processing time
- Number of critical bugs post-deployment
- Stakeholder satisfaction scores
- Successful data integrations and migrations
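The "data quality checks" responsibility above usually means applying named predicate rules to rows and reporting failure counts before data lands in the warehouse. A minimal, framework-free sketch of that pattern (the sample rows and rule names are invented for illustration; real pipelines would run this inside Spark or DBT tests):

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    total: int = 0
    failures: dict = field(default_factory=dict)

def run_quality_checks(rows, checks):
    """Apply each named predicate to every row; count failures per check."""
    report = QualityReport()
    for row in rows:
        report.total += 1
        for name, predicate in checks.items():
            if not predicate(row):
                report.failures[name] = report.failures.get(name, 0) + 1
    return report

rows = [
    {"order_id": 1, "amount": 120.0, "country": "IN"},
    {"order_id": 2, "amount": -5.0, "country": "IN"},
    {"order_id": 3, "amount": 40.0, "country": None},
]
checks = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "country_present": lambda r: r["country"] is not None,
}
report = run_quality_checks(rows, checks)
print(report.total, report.failures)  # 3 {'amount_non_negative': 1, 'country_present': 1}
```

The same shape scales up naturally: the report feeds a monitoring system, and a threshold on failure rates decides whether the pipeline proceeds or alerts.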
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Requirements
- 5+ years of experience in Linux administration and troubleshooting
- Certifications: RHCE and AWS
- Experience with AWS in an enterprise environment
- Experience in Red Hat Satellite Server administration
- Experience with automation, including deployments and configuration management
- OS and hardware firmware upgrades
- Must have scripting experience in Bash, Perl, Python, and Ansible
- Installing and configuring high-availability environments
- Fine-tuning systems for optimal performance
- Good knowledge of Linux distributions (Red Hat, Oracle Linux, CentOS, Ubuntu, openSUSE, Debian), along with patching and rollback
- Good knowledge of web servers such as Apache, Apache Tomcat, and NGINX
- Good knowledge of firewalls (iptables) and SELinux
- Good knowledge of installing and troubleshooting LDAP, DNS, DHCP, LVMs, and NFS
- Good knowledge of user administration, file systems, and logs
- Software package installation experience (deb, rpm, etc.)
- Knowledge of Linux internals to debug and solve installation issues (environment, drivers, libraries, etc.)
- Knowledge of databases such as Oracle, MySQL, and Postgres
- Set up and configure Red Hat Satellite and Kickstart servers for Red Hat Enterprise Linux 7/8/9 installs and RHN push updates

Good to Have
- Experience with LogicMonitor, OpsRamp, AppD
- Experience with Docker
- Source code control (GitHub)
- Experience on AIX and Solaris

This job was posted by Shailendra Singh from PearlShell Softech.
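Administering a mixed fleet of the distributions listed above typically starts with identifying each host's distro and version, which on modern Linux lives in `/etc/os-release` as KEY=value lines. A small, hedged sketch of parsing that format (the sample content is illustrative; a real inventory script would read the file from each host, e.g. via Ansible facts):

```python
def parse_os_release(text: str) -> dict:
    """Parse /etc/os-release-style KEY=value lines into a dict."""
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        info[key] = value.strip('"')
    return info

sample = '''
NAME="Red Hat Enterprise Linux"
VERSION_ID="9.3"
ID="rhel"
'''
release = parse_os_release(sample)
print(release["NAME"], release["VERSION_ID"])  # Red Hat Enterprise Linux 9.3
```

Keying patch and rollback logic off `ID` and `VERSION_ID` avoids brittle hostname conventions when Red Hat, Debian, and SUSE family hosts share one automation codebase.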
Posted 1 week ago
3.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good to Have Skills: Experience with Apache Spark and data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience in programming languages such as Python or Scala for data processing.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
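The extract-transform-load flow the posting describes can be reduced to three small functions. This is a deliberately simplified, pure-Python sketch of the ETL pattern (in a Databricks pipeline the same stages would be Spark reads, DataFrame transformations, and Delta writes); the CSV sample and the in-memory "warehouse" list are invented for illustration:

```python
import csv
import io

def extract(source_csv: str):
    """Extract: read raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(source_csv)))

def transform(rows):
    """Transform: normalize names, cast amounts, and drop malformed rows."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "name": row["name"].strip().title(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue  # data-quality gate: skip rows that fail casting
    return cleaned

def load(rows, target: list):
    """Load: append transformed rows to the target store (a list here)."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = "name,amount\n alice ,10.5\nbob,not-a-number\n"
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'name': 'Alice', 'amount': 10.5}]
```

Keeping the three stages as separate functions is what makes pipelines testable and composable, which is the property orchestration tools and Databricks jobs build on.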
Posted 1 week ago