
4841 Apache Jobs - Page 32

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Source: LinkedIn

Roles & Responsibilities
- Server & Storage Assessment: Understand the organization's server and storage needs and verify the compatibility of server hardware with the existing infrastructure, including network switches, storage systems, and other components.
- Physical Connectivity: Establish physical connectivity using the appropriate protocols.
- Operating System Installation: Select and install the appropriate operating system (e.g., Windows Server or a Linux distribution) based on the server's intended purpose and compatibility requirements. This involves creating partitions, configuring RAID, understanding services, etc.
- Network Configuration: Configure network settings such as IP addresses, subnet masks, gateways, DNS servers, and network interfaces so the server can communicate with other devices.
- Virtualization Technology: Understand and evaluate the server's compatibility with virtualization platforms such as VMware or Hyper-V, and assess its ability to handle virtualized workloads efficiently.
- Security Settings: Implement security measures to protect the server from unauthorized access and potential threats, including configuring firewalls, setting up access controls and user permissions, enabling encryption protocols, and establishing secure remote access methods (e.g., SSH, VPN).
- Server Roles and Features: Enable specific server roles and features based on the server's intended purpose, for example configuring a web server role (e.g., IIS, Apache) for hosting websites, setting up a database server (e.g., MySQL, Microsoft SQL Server), or enabling file and print sharing.
- Storage Configuration: Configure storage settings, including disk partitioning, file systems (e.g., NTFS, ext4), and storage technologies (e.g., RAID configurations) to ensure optimal disk performance, data redundancy, and storage capacity utilization.
- Application and Service Installation: Install and configure the applications and services the organization needs, which may include database management systems, web applications, email servers, and other software components.
- Backup and Disaster Recovery: Implement backup and disaster recovery strategies to protect critical data and ensure business continuity, including setting up regular backups, configuring backup schedules, and testing restore procedures to verify data recoverability.
- Documentation and Change Management: Maintain comprehensive documentation of the server configuration, including hardware specifications, software versions, network diagrams, and configuration details, and follow change management processes to document any modifications made over time.
- Active Directory (AD) Configuration: Understanding of and experience with the key elements of AD configuration, including (1) forest and domain design, (2) domain controller installation, (3) Active Directory schema extension, (4) user and group management, (5) organizational unit (OU) configuration, (6) Group Policy configuration, (7) trust relationships, (8) DNS integration, (9) replication configuration, and (10) security and permissions.
- AD Migration: Able to understand migration requirements and perform the key steps of an AD migration, including (1) planning and analysis, (2) designing the target environment, (3) establishing the migration plan, (4) setting up the target environment, (5) user and group migration, (6) computer and server migration, (7) data and application migration, (8) testing and validation, (9) decommissioning the source environment, and (10) documentation and post-migration tasks.

Education Qualification: Diploma or Degree (CE, IT, EC); MCITP/MCSA/MCSE certified; Red Hat certified; 3-5 years' experience.

Posted 5 days ago

Apply

4.0 years

0 Lacs

India

Remote

Source: LinkedIn

CSQ326R35

The Machine Learning (ML) Practice team is a highly specialized customer-facing ML team at Databricks facing an increasing demand for Large Language Model (LLM)-based solutions. We deliver professional services engagements to help our customers build, scale, and optimize ML pipelines, as well as put those pipelines into production. We work cross-functionally to shape long-term strategic priorities and initiatives alongside engineering, product, and developer relations, as well as support internal subject matter expert (SME) teams. We view our team as an ensemble: we look for individuals with strong, unique specializations to improve the overall strength of the team. This team is the right fit for you if you love working with customers and teammates and fueling your curiosity for the latest trends in LLMs, MLOps, and ML more broadly. This role can be remote.

The Impact You Will Have
- Develop LLM solutions on customer data, such as RAG architectures on enterprise knowledge repos, querying structured data with natural language, and content generation (a toy sketch of the retrieval step follows this listing)
- Help customers solve tough problems across industries like Health and Life Sciences, Finance, Retail, Startups, and many others
- Build, scale, and optimize customer data science workloads across industries and apply best-in-class MLOps to productionize these workloads
- Advise data teams on data science architecture, tooling, and best practices
- Provide thought leadership by presenting at conferences such as Data+AI Summit and mentoring the larger ML SME community in Databricks
- Collaborate cross-functionally with the product and engineering teams to define priorities and influence the product roadmap

What We Look For
- Experience in building Generative AI applications, including RAG, agents, Text2SQL, fine-tuning, and deploying LLMs, using tools such as HuggingFace, Langchain, and OpenAI
- 4-10 years of hands-on industry data science experience, leveraging typical machine learning and data science tools including pandas, MLflow, scikit-learn, and PyTorch
- Experience in building production-grade ML or GenAI deployments on AWS, Azure, or GCP
- Graduate degree in a quantitative discipline (Computer Science, Engineering, Statistics, Operations Research, etc.) or equivalent practical experience
- Experience in communicating and teaching technical concepts to both non-technical and technical audiences
- Passion for collaboration, life-long learning, and driving business value through ML
- [Preferred] Experience working with Databricks and Apache Spark™

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
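To make the RAG pattern named above concrete, here is a minimal, hedged sketch of the retrieval step, using TF-IDF as a stand-in for learned embeddings and a vector store; the documents and query are toy data, and this is not Databricks' actual implementation.

```python
# Minimal sketch of RAG-style retrieval: rank a small knowledge base against a
# query, then hand the top documents to an LLM as context (LLM call omitted).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [  # hypothetical enterprise knowledge snippets
    "Invoices are archived to S3 after 90 days.",
    "The churn model is retrained weekly and tracked in MLflow.",
    "Support tickets are triaged by priority and region.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

context = retrieve("How often is the churn model retrained?")
# In a real pipeline this context would be packed into an LLM prompt.
print(context)
```

A production system would swap TF-IDF for embedding models and a vector index, but the retrieve-then-generate shape stays the same.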

Posted 5 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Job Title: Senior Data Engineer – Data Quality, Ingestion & API Development

Mandatory skill set: Python, PySpark, AWS (Glue, Lambda), CI/CD
Total experience: 8+ years
Relevant experience: 8+ years
Work location: Trivandrum / Kochi
Candidates from Kerala and Tamil Nadu who are ready to relocate to the above locations are preferred. Candidates must have prior experience in a lead Data Engineer role.

Job Overview
We are seeking an experienced Senior Data Engineer to lead the development of a scalable data ingestion framework while ensuring high data quality and validation. The successful candidate will also be responsible for designing and implementing robust APIs for seamless data integration. This role is ideal for someone with deep expertise in building and managing big data pipelines using modern AWS-based technologies, and who is passionate about driving quality and efficiency in data processing systems.

Key Responsibilities
- Data Ingestion Framework:
  - Design & Development: Architect, develop, and maintain an end-to-end data ingestion framework that efficiently extracts, transforms, and loads data from diverse sources.
  - Framework Optimization: Use AWS services such as AWS Glue, Lambda, EMR, ECS, EC2, and Step Functions to build highly scalable, resilient, and automated data pipelines (see the sketch below).
- Data Quality & Validation:
  - Validation Processes: Develop and implement automated data quality checks, validation routines, and error-handling mechanisms to ensure the accuracy and integrity of incoming data.
  - Monitoring & Reporting: Establish comprehensive monitoring, logging, and alerting systems to proactively identify and resolve data quality issues.
- API Development:
  - Design & Implementation: Architect and develop secure, high-performance APIs to enable seamless integration of data services with external applications and internal systems.
  - Documentation & Best Practices: Create thorough API documentation and establish standards for API security, versioning, and performance optimization.
- Collaboration & Agile Practices:
  - Cross-Functional Communication: Work closely with business stakeholders, data scientists, and operations teams to understand requirements and translate them into technical solutions.
  - Agile Development: Participate in sprint planning, code reviews, and agile ceremonies, while contributing to continuous improvement initiatives and CI/CD pipeline development (using tools like GitLab).

Required Qualifications
- Experience & Technical Skills:
  - Professional Background: At least 5 years of relevant experience in data engineering with a strong emphasis on analytical platform development.
  - Programming Skills: Proficiency in Python and/or PySpark and SQL for developing ETL processes and handling large-scale data manipulation.
  - AWS Expertise: Extensive experience using AWS services including AWS Glue, Lambda, Step Functions, and S3 to build and manage data ingestion frameworks.
  - Data Platforms: Familiarity with big data systems (e.g., AWS EMR, Apache Spark, Apache Iceberg) and databases like DynamoDB, Aurora, Postgres, or Redshift.
  - API Development: Proven experience in designing and implementing RESTful APIs and integrating them with external and internal systems.
  - CI/CD & Agile: Hands-on experience with CI/CD pipelines (preferably with GitLab) and Agile development methodologies.
- Soft Skills:
  - Strong problem-solving abilities and attention to detail.
  - Excellent communication and interpersonal skills with the ability to work independently and collaboratively.
  - Capacity to quickly learn and adapt to new technologies and evolving business requirements.

Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with additional AWS services such as Kinesis, Firehose, and SQS.
- Familiarity with data lakehouse architectures and modern data quality frameworks.
- Prior experience in a role that required proactive data quality management and API-driven integrations in complex, multi-cluster environments.

Interested candidates, please send your resume to gigin.raj@greenbayit.com (mobile: 8943011666).
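For illustration, here is a minimal sketch of the kind of Glue-based PySpark ingestion job this posting describes: read raw CSV from S3, apply a basic data-quality gate, and land curated Parquet. The bucket names and the "orders" dataset are hypothetical; a real framework would be parameterized and far more thorough.

```python
# Hedged sketch of an AWS Glue PySpark ingestion job with a simple quality gate.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

spark = glue_context.spark_session
raw = spark.read.option("header", "true").csv("s3://raw-bucket/orders/")

# Data-quality check: drop rows missing a primary key or with invalid amounts.
clean = raw.filter(
    F.col("order_id").isNotNull() & (F.col("amount").cast("double") > 0)
)
rejected = raw.count() - clean.count()
print(f"Rejected {rejected} rows failing validation")  # feed this to alerting

clean.write.mode("overwrite").parquet("s3://curated-bucket/orders/")
job.commit()
```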

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Experience: 5+ years
Location: Bangalore, Hyderabad, Chennai, Trivandrum, Cochin
Skills: Python, GCP, BigQuery, Java
Notice period: Immediate to 30 days

We are looking for skilled Data Engineers to join our team, specializing in data migration, integration, and pipeline development using Google Cloud Platform (GCP) and BigQuery.

Key Responsibilities
- Migrate data from SQL Server and on-prem databases to Google BigQuery (see the sketch below)
- Develop and optimize ETL pipelines using Apache Airflow, Python, or Spark
- Analyze and refactor existing SSIS packages for GCP compatibility
- Integrate data from diverse sources including APIs and external databases
- Write complex SQL queries, develop views, and optimize stored procedures in BigQuery

Required Skills
- Strong experience with Python, GCP, BigQuery, Java, SQL, and Spark
- Hands-on experience in data warehousing, data migration, and pipeline development
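As a hedged sketch of one migration step named above, here is a minimal load of an extracted SQL Server table dump from Cloud Storage into BigQuery using the official google-cloud-bigquery client. The project, dataset, and bucket names are placeholders.

```python
# Load a staged CSV extract from Cloud Storage into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table_id = "my-project.sales_dw.orders"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # schema inference; explicit schemas are safer in production
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://migration-staging/orders_extract_*.csv", table_id, job_config=job_config
)
load_job.result()  # block until the load job completes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```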

Posted 5 days ago

Apply

7.5 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle WebLogic Application Server Administration
Good-to-have skills: Linux, Apache Tomcat Administration
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are functioning optimally. You will engage in problem-solving activities, contribute to key decisions, and manage the application development lifecycle to deliver high-quality results that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor application performance and implement improvements as necessary.

Professional & Technical Skills:
- Must have: Proficiency in Oracle WebLogic Application Server administration (see the WLST sketch below).
- Good to have: Experience with Linux and Apache Tomcat administration.
- Strong understanding of application deployment and configuration management.
- Experience with troubleshooting and resolving application issues.
- Familiarity with performance tuning and optimization techniques.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Oracle WebLogic Application Server administration.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
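Day-to-day WebLogic administration is commonly scripted with WLST (the WebLogic Scripting Tool), which is Jython-based and so reads as Python. Below is a minimal, hedged health-check sketch; the admin URL and credentials are placeholders, and a real script would handle errors and avoid plaintext passwords.

```python
# WLST health check: connect to the admin server and report each server's state.
# Run with: $ORACLE_HOME/oracle_common/common/bin/wlst.sh check_servers.py
connect('weblogic', 'CHANGE_ME', 't3://adminhost:7001')

domainRuntime()  # switch to the domain runtime MBean tree
servers = cmo.getServerLifeCycleRuntimes()  # lifecycle MBeans for all servers
for server in servers:
    print('%s -> %s' % (server.getName(), server.getState()))

disconnect()
exit()
```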

Posted 5 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Join us as a Lead Engineer

This is an opportunity for a driven Lead Engineer to join us and support the technical delivery of a software engineering team. You'll be responsible for developing solution design options and explaining the pros and cons to key stakeholders for appropriate decision making. Hone your existing technical skills and advance your career in this innovative and challenging role. We're offering this role at associate vice president level.

What you'll do
In this role, you'll support a team of developers and set the technical direction of the deliveries, applying the principles and methodologies of software engineering to the technical design, development, testing, and maintenance of applications and services. We'll look to you to oversee the work quality of the software engineering team, making sure that it meets the technical standards for all services output, as well as implementing a culture of concise and comprehensive technical documentation as a continuous process.

Day-to-day, you'll be:
- Supporting and monitoring the technical progress against plans, while safeguarding functionality, scalability and performance, and providing updates to stakeholders
- Supporting and mentoring the team in the understanding of relevant software languages and technical domains
- Driving the adoption of software engineering principles, processes and best practices
- Liaising with engineers, architects, business analysts and other key stakeholders to understand the objectives, requirements and options
- Designing and developing high-volume, high-performance and high-availability applications using proven frameworks and technologies

The skills you'll need
To be successful in this role, you'll need at least eight years of experience in software engineering, software design or database design and architecture, as well as experience in providing technical leadership and accountability for a software engineering team. We'll also look to you to have experience of test-driven development and the use of automated test frameworks, mocking, stubbing and unit testing tools, along with knowledge of the key phases of the software delivery lifecycle and established software development methodologies.

You'll also demonstrate:
- The ability to develop software in an SOA or microservices paradigm
- Development experience in Java, Spring Boot and RESTful APIs
- Strong understanding of microservices architecture and design patterns
- Knowledge of Apache Kafka and event-driven architecture (see the sketch below)
- Experience of working in an environment where products must be delivered to specific timescales
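The posting's stack is Java/Spring, but the event-driven pattern it names is language-agnostic; here is a hedged Python sketch (using kafka-python) purely to illustrate the produce/consume shape. The topic name, broker address, and consumer group are placeholders.

```python
# Event-driven pattern: a service emits a domain event instead of calling the
# downstream service directly; consumers subscribe independently.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Publish a "payment settled" event to the topic.
producer.send("payments.settled", {"payment_id": "p-123", "amount": 250.0})
producer.flush()

consumer = KafkaConsumer(
    "payments.settled",
    bootstrap_servers="localhost:9092",
    group_id="ledger-service",  # each consumer group gets its own copy of the stream
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for event in consumer:
    print("ledger-service handling", event.value)
    break  # sketch only; a real consumer loops forever
```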

Posted 5 days ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

📈 Experience: 9+ years
📍 Location: Pune
📢 Notice period: Immediate to 15 days; such candidates are highly encouraged to apply!
🔧 Primary Skills: Data Engineer, Lead, Architect, Python, SQL, Apache Airflow, Apache Spark, AWS (S3, Lambda, Glue)

Job Overview
We are seeking a highly skilled Data Architect / Data Engineering Lead with over 9 years of experience to drive the architecture and execution of large-scale, cloud-native data solutions. This role demands deep expertise in Python, SQL, Apache Spark, and Apache Airflow, and extensive hands-on experience with AWS services. You will lead a team of engineers, design robust data platforms, and ensure scalable, secure, and high-performance data pipelines in a cloud-first environment.

Key Responsibilities

Data Architecture & Strategy
- Architect end-to-end data platforms on AWS using services such as S3, Redshift, Glue, EMR, Athena, Lambda, and Step Functions.
- Design scalable, secure, and reliable data pipelines and storage solutions.
- Establish data modeling standards, metadata practices, and data governance frameworks.

Leadership & Collaboration
- Lead, mentor, and grow a team of data engineers, ensuring delivery of high-quality, well-documented code.
- Collaborate with stakeholders across engineering, analytics, and product to align data initiatives with business objectives.
- Champion best practices in data engineering, including reusability, scalability, and observability.

Pipeline & Platform Development
- Develop and maintain scalable ETL/ELT pipelines using Apache Airflow, Apache Spark, and AWS Glue (see the orchestration sketch below).
- Write high-performance data processing code using Python and SQL.
- Manage data workflows and orchestrate complex dependencies using Airflow and AWS Step Functions.

Monitoring, Security & Optimization
- Ensure data reliability, accuracy, and security across all platforms.
- Implement monitoring, logging, and alerting for data pipelines using AWS-native and third-party tools.
- Optimize cost, performance, and scalability of data solutions on AWS.

Required Qualifications
- 9+ years of experience in data engineering or related fields, with at least 2 years in a lead or architect role.
- Proven experience with:
  - Python and SQL for large-scale data processing
  - Apache Spark for batch and streaming data
  - Apache Airflow for workflow orchestration
  - AWS cloud services, including but not limited to S3, Redshift, EMR, Glue, Athena, Lambda, IAM, and CloudWatch
- Strong understanding of data modeling, distributed systems, and modern data architecture patterns.
- Excellent leadership, communication, and stakeholder management skills.

Preferred Qualifications
- Experience implementing data platforms using AWS Lakehouse architecture.
- Familiarity with Docker, Kubernetes, or similar container/orchestration systems.
- Knowledge of CI/CD and DevOps practices for data engineering.
- Understanding of data privacy and compliance standards (GDPR, HIPAA, etc.).
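A minimal sketch of the Airflow orchestration pattern referenced above: a daily extract/transform/load DAG with a linear dependency chain. Task bodies are stubs and the DAG name is hypothetical; this assumes Airflow 2.4+ (the `schedule` keyword).

```python
# Minimal Airflow DAG: extract -> transform -> load, scheduled daily.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # e.g., pull a day's partition from S3
    print("extracting raw data")

def transform():  # e.g., submit a Spark job via EMR or Glue
    print("transforming with Spark")

def load():       # e.g., COPY curated data into Redshift
    print("loading curated data")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```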

Posted 5 days ago

Apply

1.0 - 3.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Source: LinkedIn

Company Description
Infilytics is an AI-powered analytics automation platform. The company's mission is to empower individuals of all technical skill levels to make faster, more insightful, and confident decisions through analytics. With a user-friendly interface and a no-code platform, Infilytics makes it easy for everyone to transition from data to decisions.

Role Description
This is a full-time Back End Developer role at Infilytics. The Back End Developer will be responsible for software and web development, utilizing Object-Oriented Programming (OOP) principles.

Location
This is an on-site position located in Rajkot.

Responsibilities
- Design, develop, and maintain backend services using Java and Spring Boot
- Build and integrate RESTful APIs
- Work with databases using Hibernate and JDBC, and write optimized SQL queries
- Process and analyze large datasets using Apache Spark
- Collaborate with front-end developers and other team members to deliver high-quality software
- Maintain code quality through version control and best practices

Eligibility
- 1-3 years' experience in core Java application development and REST frameworks
- Bachelor's degree in Computer Science, Information Technology, or a similar field

Posted 5 days ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

Job description

About the Company
myUpchar is India's largest health-tech company, with the vision to empower every individual with accessible and affordable healthcare through innovative technology, comprehensive medical information, teleconsultations, and high-quality medicine, ensuring a healthier and more informed society. myUpchar was founded by alumni of Stanford University with rich experience at Amazon and BCG, among other leading global firms. myUpchar has been funded by top VCs and angel investors in India, including Omidyar Network, Nexus VP, and Rajan Anandan. Currently, the platform receives around 50 million visits monthly and provides a positive health outcome to over 100K patients every month through our science-backed, result-oriented treatment approach.

We are looking for an experienced and motivated Software Developer with a strong focus on Ruby on Rails (RoR) to join our dynamic development team. The ideal candidate will have a solid understanding of building web applications using the Ruby on Rails framework and will be responsible for designing, developing, and maintaining scalable and high-performance web applications. The role requires expertise in front-end and back-end technologies, with a focus on Ruby on Rails, MySQL, Redis, Elasticsearch, and other modern technologies.

Position: Software Engineer - Ruby on Rails (RoR)
Location: Okhla Phase 3, South Delhi
Experience: 1-3 years
Employment Type: Full Time

Mandatory skills:
- Ruby on Rails
- Agile methodology
- jQuery, AJAX, MySQL 5.x, Apache
- Experience in Git, SVN and deployment
- Rails-specific server administration
- Good knowledge of the latest trends and developments in the Ruby and Rails community

Roles & Responsibilities:
- Independently execute the project (requirements gathering, analysis, technical design, development, testing, deployment)
- Coordinate with the stakeholders for ongoing technical discussions, status updates, etc.
- Provide technical assistance and implementation for interfacing with 3rd-party APIs
- Evaluate and add open-source components based on project needs
- Mentor and assist other developers
- Should have developed at least one complex application using these tech stacks

Nice-to-have skills:
- NoSQL databases like MongoDB
- Experience with Elasticsearch
- Contribution to the community in the form of gems, plugins or technical articles/publications

Desired Candidate Profile:
- Experience: 1-3 years
- Good communication and interpersonal skills

Posted 5 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: AWS Data Engineer
Location: Pan India
Experience: 6-8 years
Job Type: Contract to Hire
Notice Period: Immediate joiners
Mandatory Skills: AWS services (S3, Lambda, Redshift, Glue), Python, PySpark, SQL

Job description:
At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth.

As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization (see the validation sketch below).
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms such as AWS, Apache Airflow, and Apache Iceberg to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance; Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.
- Bonus Points: Experience with visualization tools (Looker, Tableau) and reporting frameworks to provide actionable insights.
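As a hedged illustration of the automated data-quality gate mentioned above, here is a toy pandas sketch; in production these checks would run inside a pipeline task (e.g., Glue or Airflow) against real datasets, and the column names here are hypothetical.

```python
# Toy data-quality gate: run named checks and flag the batch if any fail.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 2, None],
    "amount": [100.0, -5.0, 25.0, 40.0],
})

checks = {
    "no_null_keys": orders["order_id"].notna().all(),
    "unique_keys": orders["order_id"].dropna().is_unique,
    "positive_amounts": (orders["amount"] > 0).all(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    # In a real pipeline: raise, alert, and quarantine the batch.
    print("Data-quality checks failed:", failed)
else:
    print("All checks passed")
```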

Posted 5 days ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Software Development Engineers are experienced professionals who design, develop, test, deploy, maintain, and enhance software solutions. They have in-depth knowledge and subject matter expertise in software development. Sr. Software Development Engineers interact with internal and external teams to train them on the products, work on projects independently, and collaborate with cross-functional teams to manage project priorities, deadlines, and deliverables. In this role, you will mentor and guide others by reviewing the code of more junior software engineers, and encourage others to grow their technical skillset. Sr. Software Development Engineers are creative problem solvers who continuously drive improvements across the software development life cycle and ensure best practices are utilized.

About The Role:
In this role as Software Engineer, you will:
- Design, develop and test software systems and/or applications for enhancements and new products
- Write code according to the coding specifications established for software solutions
- Deliver software features with exceptional quality, meeting designated release plans and delivery commitments
- Develop software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes; investigating problem areas; and following the software development lifecycle
- Prepare and install solutions by determining and designing system specifications, standards, and programming
- Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Improve operations by conducting systems analysis and recommending changes in policies and procedures
- Update job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment, and by participating in educational opportunities, reading professional publications, maintaining personal networks, and participating in professional organizations
- Protect operations by keeping information confidential
- Provide information by collecting, analyzing, and summarizing development and service issues
- Accomplish engineering and organization mission by completing related results as needed
- Collaborate with other designers and engineers
- Break down customer requirements/problems into manageable tasks for the team
- Clearly communicate technical concepts to stakeholders

About You:
You are a fit for this position if your background includes:
- 3 to 6 years of experience in software development
- Bachelor's degree in Systems Engineering or similar
- Proficiency in Java / JavaScript / Angular
- Experience with REST APIs and microservices
- Strong problem solving and analytical thinking
- Good written and verbal communication skills

Required Skills: Amazon Web Services (AWS); Fiddler Web Debugger (Inactive); Git; GitHub; Gradle; Hypertext Transfer Protocol (HTTP); JUnit Testing; JUnit Testing Framework; Mockito; Mockito Unit Test Framework; PostgreSQL; Postman (Platform); Postman (Software); REST Client; RESTful APIs; Spring MVC (Model View Controller); Spring Web MVC; Structured Query Language (SQL).
Optional Skills: Apache Ant; Apache Ivy; Apache Tomcat; Azure DevOps; Eclipse Development; Eclipse IDE; GitHub Copilot; Helm (Tool); IntelliJ IDEA IDE (Integrated Development Environment); JetBrains IntelliJ IDEA; Kubernetes; Microsoft Azure DevOps Boards; Microsoft Azure DevOps Pipelines.

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 5 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

About company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services, headquartered in Bengaluru, with gross revenue of ₹222.1 billion and a global workforce of 234,054. It is listed on NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line, with major delivery centers in India in cities such as Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: GCP Data Architect
Work Mode: Hybrid
Location: Pune
Experience: 10+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: GCP, SQL

Detailed JD
A Google Cloud Platform (GCP) Data Architect with experience in migrating a legacy platform to Cloud SQL (preferred) would typically have a robust set of skills and expertise, including:

1. Cloud SQL Expertise:
- Deep knowledge of Google Cloud SQL, a fully managed relational database service that supports databases like MySQL, PostgreSQL, and SQL Server.
- Experience in designing scalable, highly available, and secure database solutions on Cloud SQL.

2. Migration Strategies:
- Strong experience in data migration from mainframe databases (like DB2 or IMS) to modern cloud-based relational databases.
- Knowledge of ETL tools (e.g., Google Cloud Dataflow, Apache Beam) for extracting, transforming, and loading mainframe data into Cloud SQL.
- Familiarity with Database Migration Service (DMS) in GCP for automated database migrations from legacy systems to Cloud SQL.

3. Data Modeling:
- Ability to translate mainframe data structures (which may use COBOL or other legacy formats) into a relational schema that fits Cloud SQL's SQL-based architecture.
- Expertise in normalizing and optimizing data models for Cloud SQL environments (must have).
- Represent the data model and seek its approval in the HSBC Data Architecture forum.
- Create and maintain the data dictionary.

4. Data Integration and Transformation:
- Proficiency in integrating data from different sources, ensuring data consistency and accuracy during migration (see the validation sketch below).
- Use of Google Cloud Storage, BigQuery, or other tools for intermediate data storage and analysis during migration.

5. Cloud Architecture and Design:
- Architecting and designing highly available, fault-tolerant cloud infrastructure for running Cloud SQL databases.
- Ensuring that the design can scale horizontally or vertically as needed, and optimizing for cost-efficiency.

6. Performance Tuning and Optimization:
- Experience with performance tuning, query optimization, and configuring Cloud SQL to handle high-throughput workloads.
- Monitoring tools such as the Google Cloud Operations suite (formerly Stackdriver) for real-time performance tracking.
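As a hedged sketch of the consistency checks mentioned above, here is a minimal post-migration row-count validation against a Cloud SQL (MySQL) target using Google's Cloud SQL Python Connector. The instance connection name, credentials, and table names are placeholders; real validation would also compare checksums and sampled rows against the source system.

```python
# Post-migration validation: count rows per table in the Cloud SQL target.
from google.cloud.sql.connector import Connector

connector = Connector()
conn = connector.connect(
    "my-project:asia-south1:sales-mysql",  # instance connection name (placeholder)
    "pymysql",
    user="migration_user",
    password="CHANGE_ME",
    db="sales",
)

cursor = conn.cursor()
for table in ("customers", "orders", "payments"):
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    count = cursor.fetchone()[0]
    # Compare against counts captured from the source system before cutover.
    print(f"{table}: {count} rows in Cloud SQL")

conn.close()
connector.close()
```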

Posted 5 days ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Job Role: General & Operations Managers for a Workflow Annotation Specialist project
Project Type: Contract-based / Freelance / Part-time, 1 month

Job Overview:
We are seeking domain experts to participate in a Workflow Annotation Project. The role involves documenting and annotating the step-by-step workflows of key tasks within the candidate's area of expertise. The goal is to capture real-world processes in a structured format for AI training and process optimization purposes.

Domain Expertise Required:
Track projects, allocate resources, analyze operational data, adjust budgets, enforce policy, and produce performance reports; senior ops leads own forecasts and strategic resource shifts.

Tools & Technologies You May Have Worked With:
- Commercial software: Asana, Trello, Jira, Monday.com, MS Project, QuickBooks, SAP, Oracle Financials, Slack, Zoom, Teams, Tableau, Power BI, Workday, BambooHR, ADP, NetSuite, SAP S/4HANA, Google Docs
- Open/free software: OpenProject, Taiga, Kanboard, Leantime, Wekan, LibreOffice Calc, Google Sheets (free), GnuCash, Odoo Community, ERPNext, Rocket.Chat, Jitsi Meet, Metabase, Apache Superset, Nextcloud

Posted 5 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Location: Pune
Mode of Work: Full-time, on-site
Experience required: 3+ years

Who You Are
A highly skilled and detail-oriented Software Development Engineer in Test (SDET) with a passion for building robust, scalable, and efficient test automation solutions.

Your Role
You will be responsible for designing, developing, and executing comprehensive automation test strategies for microservices-based applications. You will play a critical role in maintaining our code quality and system reliability in CI/CD pipelines by owning both manual and automated quality assurance processes.

Desired Technical Competencies & Skills
- Develop robust automation frameworks using Java and Python to test APIs and web services (see the sketch below).
- Design test plans and write test cases for microservices using tools such as Selenium, Cucumber, Testerman, or Karate.
- Integrate automated tests within CI/CD pipelines using Jenkins.
- Perform API testing (manual and automated) for RESTful services.
- Conduct performance testing using Apache JMeter.
- Collaborate closely with DevOps to validate applications in Dockerized and Kubernetes environments.
- Troubleshoot, log, and document defects and improvements across cloud-hosted services (preferably AWS).

What We Offer
- Leadership & Impact: Drive impactful projects in a dynamic environment.
- Growth & Learning: Continuous learning and career advancement opportunities.
- Recognition & Excellence: Acknowledgment for innovative contributions.
- Global Influence: Lead initiatives with global impact.

Benefits
- Work-Life Harmony: Flexible schedules prioritizing well-being.
- Relocation Support, Global Opportunities: Seamless transitions and international exposure.
- Rewarding Performance: Performance-based bonuses and annual rewards.
- Comprehensive Well-being: Benefits including Provident Fund and health insurance.

(ref:hirist.tech)
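A minimal, hedged sketch of the automated REST API checks this role covers, written with pytest and requests; the base URL, endpoints, and payload are hypothetical.

```python
# Run with: pytest test_users_api.py
import requests

BASE_URL = "https://api.example.internal"  # placeholder service under test

def test_create_and_fetch_user():
    # Create a resource, then verify it round-trips through the API.
    created = requests.post(
        f"{BASE_URL}/users",
        json={"name": "Asha", "email": "asha@example.com"},
        timeout=5,
    )
    assert created.status_code == 201
    user_id = created.json()["id"]

    fetched = requests.get(f"{BASE_URL}/users/{user_id}", timeout=5)
    assert fetched.status_code == 200
    assert fetched.json()["email"] == "asha@example.com"

def test_unknown_user_returns_404():
    response = requests.get(f"{BASE_URL}/users/does-not-exist", timeout=5)
    assert response.status_code == 404
```

In a CI/CD pipeline, a Jenkins stage would run this suite against an ephemeral environment before promoting the build.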

Posted 5 days ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Skills: Kafka, Apache Kafka, Kafka Streams, Kafka Connect, Shell Scripting, Linux Administration

Job Overview
ALIQAN Technologies is seeking a skilled Kafka Admin for a mid-level contract position. The role is based in Bangalore, Chennai, or Pune. We are looking for candidates with 4 to 6 years of relevant work experience who can bring expertise in managing and optimizing Kafka ecosystems.

Qualifications And Skills
- A minimum of 4 years of working experience with Kafka (mandatory skill).
- Proficiency with Apache Kafka for enterprise-grade applications (mandatory skill).
- Expertise in Kafka Streams for real-time stream processing (mandatory skill).
- Experience with Kafka Connect for data integration between Kafka and other systems.
- Proficiency in shell scripting for managing and automating tasks in a Linux environment.
- A solid understanding of distributed systems and event-driven programming.
- Excellent problem-solving skills and the ability to troubleshoot issues quickly and effectively.
- Strong communication skills and the ability to collaborate within cross-functional teams.

Roles And Responsibilities
- Manage and administer Kafka clusters, ensuring their stability, performance, and security (see the admin sketch below).
- Implement and manage Kafka stream processing solutions to meet business needs.
- Collaborate with development teams to integrate Kafka solutions with enterprise applications.
- Monitor and optimize Kafka performance, scalability, and reliability.
- Develop and maintain automation scripts to streamline Kafka operations and administrative tasks.
- Provide technical support and troubleshooting for Kafka-related issues.
- Ensure proper documentation of Kafka architecture, configurations, and operational procedures.
- Stay updated with the latest trends and advancements in Kafka technologies and propose improvements.
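As a hedged sketch of routine Kafka administration from Python, here is a minimal example using the confluent-kafka AdminClient: create a topic with explicit sizing, then inspect cluster metadata. The broker address, topic name, and partition/replication numbers are placeholders.

```python
# Routine Kafka admin: create a topic, then do a basic cluster inventory check.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092"})

# Explicit partitioning and replication for throughput and resilience.
futures = admin.create_topics(
    [NewTopic("orders.events", num_partitions=6, replication_factor=3)]
)
for topic, future in futures.items():
    try:
        future.result()  # raises if creation failed (e.g., topic exists)
        print(f"created {topic}")
    except Exception as exc:
        print(f"failed to create {topic}: {exc}")

# Cluster metadata: how many brokers are up and which topics exist.
metadata = admin.list_topics(timeout=10)
print(f"{len(metadata.brokers)} brokers, {len(metadata.topics)} topics")
```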

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Position Title: AI/ML Engineer
Company: Cyfuture India Pvt. Ltd.
Industry: IT Services and IT Consulting
Location: Sector 81, NSEZ, Noida (5 days work from office)

About Cyfuture
Cyfuture is a trusted name in IT services and cloud infrastructure, offering state-of-the-art data center solutions and managed services across platforms like AWS, Azure, and VMware. We are expanding rapidly in system integration and managed services, building strong alliances with global OEMs like VMware, AWS, Azure, HP, Dell, Lenovo, and Palo Alto.

Position Overview
We are hiring an experienced AI/ML Engineer to lead and shape our AI/ML initiatives. The ideal candidate will have hands-on experience in machine learning and artificial intelligence, with strong leadership capabilities and a passion for delivering production-ready solutions. This role involves end-to-end ownership of AI/ML projects, from strategy development to deployment and optimization of large-scale systems.

Key Responsibilities
- Lead and mentor a high-performing AI/ML team.
- Design and execute AI/ML strategies aligned with business goals.
- Collaborate with product and engineering teams to identify impactful AI opportunities.
- Build, train, fine-tune, and deploy ML models in production environments (see the serving sketch below).
- Manage operations of LLMs and other AI models using modern cloud and MLOps tools.
- Implement scalable and automated ML pipelines (e.g., with Kubeflow or MLRun).
- Handle containerization and orchestration using Docker and Kubernetes.
- Optimize GPU/TPU resources for training and inference tasks.
- Develop efficient RAG pipelines with low latency and high retrieval accuracy.
- Automate CI/CD workflows for continuous integration and delivery of ML systems.

Key Skills & Expertise

Cloud Computing & Deployment
- Proficiency in AWS, Google Cloud, or Azure for scalable model deployment.
- Familiarity with cloud-native services like AWS SageMaker, Google Vertex AI, or Azure ML.
- Expertise in Docker and Kubernetes for containerized deployments.
- Experience with Infrastructure as Code (IaC) using tools like Terraform or CloudFormation.

Machine Learning & Deep Learning
- Strong command of frameworks: TensorFlow, PyTorch, Scikit-learn, XGBoost.
- Experience with MLOps tools for integration, monitoring, and automation.
- Expertise in pre-trained models, transfer learning, and designing custom architectures.

Programming & Software Engineering
- Strong skills in Python (NumPy, Pandas, Matplotlib, SciPy) for ML development.
- Backend/API development with FastAPI, Flask, or Django.
- Database handling with SQL and NoSQL (PostgreSQL, MongoDB, BigQuery).
- Familiarity with CI/CD pipelines (GitHub Actions, Jenkins).

Scalable AI Systems
- Proven ability to build AI-driven applications at scale.
- Ability to handle large datasets, high-throughput requests, and real-time inference.
- Knowledge of distributed computing: Apache Spark, Dask, Ray.

Model Monitoring & Optimization
- Hands-on experience with model compression, quantization, and pruning.
- A/B testing and performance tracking in production.
- Knowledge of model retraining pipelines for continuous learning.

Resource Optimization
- Efficient use of compute resources: GPUs, TPUs, CPUs.
- Experience with serverless architectures to reduce cost.
- Auto-scaling and load balancing for high-traffic systems.

Problem-Solving & Collaboration
- Translate complex ML models into user-friendly applications.
- Work effectively with data scientists, engineers, and product teams.
- Write clear technical documentation and architecture reports.

(ref:hirist.tech)
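A minimal, hedged sketch of the model-serving pattern referenced above: a FastAPI endpoint wrapping a scoring function. The "model" here is a stub with made-up features and thresholds; a real service would load a serialized scikit-learn or PyTorch artifact at startup.

```python
# Serve a (stubbed) churn model behind a REST endpoint with FastAPI.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Features(BaseModel):
    monthly_spend: float
    tenure_months: int

def predict_churn(features: Features) -> float:
    # Stand-in scoring logic; replace with model.predict_proba(...) in practice.
    if features.monthly_spend < 10 and features.tenure_months < 6:
        return 0.8
    return 0.2

@app.post("/predict")
def predict(features: Features) -> dict:
    return {"churn_probability": predict_churn(features)}

# Run locally with: uvicorn app:app --reload
```

In production this container would sit behind an autoscaler and emit prediction logs for the monitoring and retraining loops the posting describes.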

Posted 5 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Role: Java Architect
Experience: 10+ years
Location: Hyderabad / Remote

We are looking for a highly skilled Java Architect to design, develop, and implement Java-based applications. The ideal candidate will have extensive experience in modern software architecture, design patterns, and development best practices. You will play a key role in ensuring our architecture is scalable, extensible, and aligned with business needs.

Key Responsibilities
- Design the end-to-end architecture and development of scalable Java-based backend systems and microservices.
- Define and enforce coding standards and architectural best practices across the engineering team.
- Translate complex business requirements into effective technical designs and comprehensive solutions.
- Collaborate closely with product managers, QA teams, and stakeholders to align technology initiatives with business goals.
- Drive performance tuning, application security, and cloud modernization efforts.
- Mentor and coach junior engineers, fostering a culture of continuous improvement and engineering excellence.
- Manage project execution, including planning, risk assessment, and timely delivery.
- Conduct in-depth code reviews and offer constructive feedback to maintain high-quality codebases.
- Maintain clear, detailed technical documentation to support ongoing development and knowledge sharing.
- Participate actively in all phases of the software development lifecycle (SDLC).
- Identify performance bottlenecks and implement effective optimization strategies.
- Design and implement microservices architecture using Spring Boot and related frameworks.
- Integrate third-party APIs and services to extend application functionality.
- Support and enhance CI/CD pipelines to ensure efficient and reliable deployments.
- Stay current with emerging technologies and recommend upgrades, tools, and frameworks as needed.
- Ensure secure coding practices and compliance with organizational and regulatory standards.
- Communicate technical concepts clearly to both technical and non-technical stakeholders.

Required Skills And Qualifications
- 9+ years of hands-on experience in Java development and enterprise software architecture.
- Deep expertise in Java, Spring Boot, the Spring Framework, and Hibernate/JPA.
- Strong grasp of object-oriented programming (OOP), design patterns, and clean architecture principles.
- Proven experience designing, building, and scaling microservices-based architectures.
- Advanced SQL skills with extensive experience using Oracle Database.
- Exposure to Apache Kafka and event-driven architectures or message broker systems.
- Proficiency in RESTful API design, working with JSON, and integrating external services and APIs.
- Experience with CI/CD pipelines, version control (Git), and build automation tools such as Jenkins.
- Skilled in Docker for containerization and Kubernetes for orchestration and deployment.
- Strong focus on application security, secure coding standards, and compliance.
- Excellent analytical and problem-solving skills, with great attention to detail.
- Agile/Scrum practitioner, comfortable working in cross-functional teams or independently.
- Proven track record of mentoring junior developers and conducting in-depth code reviews.
- Experience in performance tuning, system profiling, and optimizing enterprise applications.

Nice to Have
- Familiarity with frontend or full-stack ecosystems like React.js, Angular, or Node.js.
- Understanding of on-premises environments.
- Exposure to cloud platforms such as AWS or Azure, including services like EC2, Lambda, S3, and CloudWatch.
- Understanding of DevOps practices, Infrastructure as Code (IaC), and monitoring/observability tools.
- Experience with JUnit and other test automation frameworks for unit and integration testing.

(ref:hirist.tech)

Posted 5 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

GAQ326R190

Mission
As the Staff People Business Partner for India, you will have the unique opportunity to drive meaningful impact across our largest and most dynamic region in APJ. In this pivotal role, you will navigate complex organizational challenges by partnering closely with leaders across India-based teams, acting as a strategic thought partner, consultant, and champion for talent strategy and people initiatives. You will serve as a trusted advisor on all aspects of organizational effectiveness—including organizational planning and design, performance management, career development, leadership coaching, employee relations, and compensation. Your expertise will help build scalable, progressive, and high-performing organizations. In close collaboration with your People Partner leader, you will embody and advocate for our company's principles, values, and policies, fostering a global, inclusive, and high-performance work environment that empowers every employee to thrive.

Outcomes
- Serve as a trusted advisor to India senior leadership and global leaders with India-based teams, delivering impactful solutions that benefit both the business and employees while enabling scalable growth.
- Facilitate and manage core people programs, policies, and procedures for the India team—including, but not limited to, performance management, culture surveys, talent management, career development, compensation, benefits and rewards, development programs, and change management.
- Design and implement effective change management strategies and learning programs to promote organizational health.
- Leverage data and insights to develop and align talent strategies that directly support business objectives and drive organizational success.
- Lead the execution of key organizational initiatives and goals by applying effective planning and project management methodologies, ensuring alignment with overall business objectives.
- Deliver on initiatives and goals through thoughtful organizational planning and project management.
- Act as the primary point of contact between business units and central People Operations, Benefits, Payroll, and other cross-functional teams. Clearly communicate business-specific people priorities and advocate for integrating these needs into centralized programs and policies.
- Provide expert support and consultation across the People team, fostering collaboration and driving cross-functional initiatives aimed at organizational improvement.
- Partner with the Employee Relations team to address and resolve employee relations matters, including participating in investigations, managing disciplinary actions, and facilitating performance management discussions.
- Contribute to or support APJ initiatives as needed.

Competencies
- 5+ years of HR experience that shows proven success as a strategic partner working with managers up through the VP+ level.
- Proactive, resilient, and able to thrive in a fast-paced, evolving environment.
- In-depth knowledge of Human Resources practices and legal requirements in India.
- Strong organizational skills and detail orientation.
- Highly adaptable; drives change and influences leaders during rapid growth, especially those new to local norms.
- Strong verbal and written communicator; effectively interprets and conveys ideas, information, instructions, policies and procedures.
- Strong judgment in decision-making and problem-solving in ambiguous situations.
- Skilled in data analysis to generate actionable insights.
- Strong sense of urgency with the ability to handle multiple competing priorities.
- Excellent computer skills, including proficiency in Google Workspace and the Microsoft Office Suite.

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.

Posted 5 days ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Skills: Kafka, Apache Kafka, Kafka Streams, Kafka Connect, Shell Scripting, Linux Administration

Job Overview

ALIQAN Technologies is seeking a skilled Kafka Admin for a mid-level contract position. The role is based in Bangalore, Chennai, or Pune. We are looking for candidates with 4 to 6 years of relevant work experience who can bring expertise in managing and optimizing Kafka ecosystems.

Qualifications and Skills

- A minimum of 4 years of working experience with Kafka (mandatory skill).
- Proficiency with Apache Kafka for enterprise-grade applications (mandatory skill).
- Expertise in Kafka Streams for real-time stream processing (mandatory skill).
- Experience with Kafka Connect for data integration between Kafka and other systems.
- Proficiency in shell scripting for managing and automating tasks in a Linux environment.
- A solid understanding of distributed systems and event-driven programming.
- Excellent problem-solving skills and the ability to troubleshoot issues quickly and effectively.
- Strong communication skills and the ability to collaborate within cross-functional teams.

Roles and Responsibilities

- Manage and administer Kafka clusters, ensuring their stability, performance, and security.
- Implement and manage Kafka stream processing solutions to meet business needs.
- Collaborate with development teams to integrate Kafka solutions with enterprise applications.
- Monitor and optimize Kafka performance, scalability, and reliability.
- Develop and maintain automation scripts to streamline Kafka operations and administrative tasks (a minimal sketch of this kind of automation follows this posting).
- Provide technical support and troubleshooting for Kafka-related issues.
- Ensure proper documentation of Kafka architecture, configurations, and operational procedures.
- Stay updated with the latest trends and advancements in Kafka technologies and propose improvements.
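The cluster-administration and automation duties above lend themselves to scripting against Kafka's admin APIs. Purely as a hedged illustration (not part of the posting), here is a minimal Python sketch using the confluent-kafka library's AdminClient; the broker address and topic names are placeholders.

```python
# Minimal sketch: idempotently provisioning Kafka topics with confluent-kafka.
from confluent_kafka.admin import AdminClient, NewTopic

# Broker address is a placeholder; a real cluster would come from config management.
admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# List existing topics so already-provisioned ones are skipped.
existing = set(admin.list_topics(timeout=10).topics)

wanted = [
    NewTopic("orders", num_partitions=6, replication_factor=3),
    NewTopic("payments", num_partitions=6, replication_factor=3),
]
to_create = [t for t in wanted if t.topic not in existing]

if to_create:
    # create_topics() returns a dict of topic -> Future; result() raises on failure.
    for topic, future in admin.create_topics(to_create).items():
        try:
            future.result()
            print(f"created {topic}")
        except Exception as exc:
            print(f"failed to create {topic}: {exc}")
```

The same task could be done with the kafka-topics.sh CLI from a shell script; the library route is simply easier to fold into larger automation.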

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Step into the role of a Senior Data Engineer. At Barclays, innovation isn’t just encouraged, it’s expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:

- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success.
- A strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks.
- The ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies (a minimal streaming-ingestion sketch follows this posting).
- Hands-on programming experience in Python and PySpark.
- An understanding of DevOps pipelines using Jenkins and GitLab; strength in data modelling and data architecture concepts; and familiarity with project management tools and Agile methodology.
- Sound knowledge of data governance principles and tools (Alation, Glue Data Quality, mesh), and the ability to suggest solution architecture for diverse technology applications.

Additional skills that are highly valued:

- Experience in the financial services industry, across settlements and sub-ledger functions such as PNS, Stock Record and Settlements, and PNL.
- Knowledge of the BPS, IMPACT and Gloss products from Broadridge, and of creating ML models using Python, Spark and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role

To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities

- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations

- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies and processes; deliver continuous improvements and escalate breaches of policies or procedures.
- If managing a team, define jobs and responsibilities, plan for the department’s future needs and operations, counsel employees on performance and contribute to employee pay decisions and changes. People Leaders may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
- For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify where other areas of specialisation are needed to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
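The real-time ingestion requirement above (Kafka plus Spark Streaming) typically looks something like the following hedged PySpark sketch. The topic, schema fields, and S3 paths are illustrative assumptions, not details from the posting, and the job assumes the spark-sql-kafka package is on the classpath.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Parquet on S3.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Schema for the JSON payload; field names here are invented for illustration.
schema = StructType([
    StructField("trade_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
       .option("subscribe", "settlements")                # placeholder topic
       .load())

# Kafka delivers bytes; cast to string and parse the JSON body.
parsed = (raw
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Land micro-batches as Parquet; bucket and checkpoint paths are placeholders.
query = (parsed.writeStream.format("parquet")
         .option("path", "s3a://example-bucket/settlements/")
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/settlements/")
         .start())
query.awaitTermination()
```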

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join Barclays as a Senior Data Engineer. At Barclays, we are building the bank of tomorrow. As a Strategy and Transformation Lead you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:

- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success.
- A strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks (a minimal Glue orchestration sketch follows this posting).
- The ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.
- Hands-on programming experience in Python and PySpark.
- An understanding of DevOps pipelines using Jenkins and GitLab; strength in data modelling and data architecture concepts; and familiarity with project management tools and Agile methodology.
- Sound knowledge of data governance principles and tools (Alation, Glue Data Quality, mesh), and the ability to suggest solution architecture for diverse technology applications.

Additional skills that are highly valued:

- Experience in the financial services industry, across settlements and sub-ledger functions such as PNS, Stock Record and Settlements, and PNL.
- Knowledge of the BPS, IMPACT and Gloss products from Broadridge, and of creating ML models using Python, Spark and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role

To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities

- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations

- Contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies and processes; deliver continuous improvements and escalate breaches of policies or procedures.
- If managing a team, define jobs and responsibilities, plan for the department’s future needs and operations, counsel employees on performance and contribute to employee pay decisions and changes. People Leaders may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
- For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify where other areas of specialisation are needed to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
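The batch-mode stack named above (Glue, Lambda, S3) is usually orchestrated through AWS APIs. As one hedged illustration, a minimal boto3 sketch for starting and polling a Glue job might look like the following; the job name, arguments, and region are hypothetical and the Glue job itself is assumed to exist already.

```python
# Minimal sketch: kick off an AWS Glue batch job and wait for a terminal state.
import time

import boto3

glue = boto3.client("glue", region_name="eu-west-1")  # region is a placeholder

# Job name and arguments are invented for illustration.
run = glue.start_job_run(
    JobName="nightly-settlements-etl",
    Arguments={
        "--source_path": "s3://example-bucket/raw/",
        "--target_path": "s3://example-bucket/curated/",
    },
)
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(
        JobName="nightly-settlements-etl", RunId=run_id
    )["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print(f"run {run_id} finished with state {state}")
        break
    time.sleep(30)
```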

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

This position participates in the support of batch and real-time data pipelines utilizing various data analytics processing frameworks, in support of data science practices for the Marketing and Finance business units. It supports the integration of data from various sources, performs extract, transform, load (ETL) conversions, and facilitates data cleansing and enrichment. The position performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software, and contributes to synthesizing disparate data sources to support reusable and reproducible data assets.

Responsibilities

- Supervises and supports data engineering projects and builds solutions by leveraging a strong foundational knowledge in software/application development.
- Develops and delivers data engineering documentation.
- Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.
- Recommends analytic reporting products/tools and supports the adoption of emerging technology.
- Performs data engineering maintenance and support.
- Provides the implementation strategy and executes backup, recovery, and technology solutions to perform analysis.
- Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform (a minimal sketch of such a step follows this posting).

Required Qualifications

- Codes in programming languages used for statistical analysis and modeling, such as Python, Java, Scala, or C#.
- Strong understanding of database systems and data warehousing solutions.
- Strong understanding of the data interconnections between organizations’ operational and business functions.
- Strong understanding of the data life cycle stages: data collection, transformation, analysis, storing data securely, and providing data accessibility.
- Strong understanding of the data environment and how it must scale for demands such as data throughput, increasing pipeline throughput, analyzing large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
- Strong knowledge of data structures, as well as data filtering and data optimization.
- Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik).
- Strong understanding of a cloud services platform (e.g., GCP, Azure, or AWS) and all the data life cycle stages; Azure preferred.
- Understanding of distributed systems and the underlying business problem being addressed; guides team members on how their work will assist by performing data analysis and presenting findings to stakeholders.
- Bachelor’s degree in MIS, mathematics, statistics, or computer science, an international equivalent, or equivalent job experience.

Required Skills

- 3 years of experience with Databricks.
- Other required experience: SSIS/SSAS, Apache Spark, Python, R, SQL, SQL Server.

Preferred Skills

- Scala, Delta Lake, Unity Catalog, Azure Logic Apps, a cloud services platform (e.g., GCP, Azure, or AWS).

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
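For a concrete flavour of the Databricks/Spark ETL work described above, here is a minimal, hedged PySpark sketch that reads raw CSV, cleanses it, and writes a Delta table. The paths, column names, and table name are invented for illustration, and a Delta-enabled Spark runtime (such as a Databricks cluster) is assumed.

```python
# Minimal sketch: batch ETL step reading CSV, cleansing, and writing Delta.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date, trim

spark = SparkSession.builder.appName("marketing-etl").getOrCreate()

# Source path is a placeholder for a mounted raw-data location.
raw = spark.read.option("header", True).csv("/mnt/raw/marketing/campaigns.csv")

clean = (raw
         .withColumn("campaign_name", trim(col("campaign_name")))
         .withColumn("start_date", to_date(col("start_date"), "yyyy-MM-dd"))
         .dropDuplicates(["campaign_id"])
         .filter(col("campaign_id").isNotNull()))

# Overwrite keeps the example idempotent; incremental loads would use MERGE.
# The "analytics" schema is assumed to exist.
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.campaigns")
```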

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Explore your next opportunity at an organization that ranks among the world's 500 largest companies. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you grow every day. We know what it takes to lead UPS into tomorrow: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description

This position participates in the support of batch and real-time data pipelines utilizing various data analytics processing frameworks, in support of data science practices for the Marketing and Finance business units. It supports the integration of data from various sources, performs extract, transform, load (ETL) conversions, and facilitates data cleansing and enrichment. The position performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software, and contributes to synthesizing disparate data sources to support reusable and reproducible data assets.

Responsibilities

- Supervises and supports data engineering projects and builds solutions by leveraging a strong foundational knowledge in software/application development.
- Develops and delivers data engineering documentation.
- Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.
- Recommends analytic reporting products/tools and supports the adoption of emerging technology.
- Performs data engineering maintenance and support.
- Provides the implementation strategy and executes backup, recovery, and technology solutions to perform analysis.
- Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform.

Required Qualifications

- Codes in programming languages used for statistical analysis and modeling, such as Python, Java, Scala, or C#.
- Strong understanding of database systems and data warehousing solutions.
- Strong understanding of the data interconnections between organizations’ operational and business functions.
- Strong understanding of the data life cycle stages: data collection, transformation, analysis, storing data securely, and providing data accessibility.
- Strong understanding of the data environment and how it must scale for demands such as data throughput, increasing pipeline throughput, analyzing large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
- Strong knowledge of data structures, as well as data filtering and data optimization.
- Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik).
- Strong understanding of a cloud services platform (e.g., GCP, Azure, or AWS) and all the data life cycle stages; Azure preferred.
- Understanding of distributed systems and the underlying business problem being addressed; guides team members on how their work will assist by performing data analysis and presenting findings to stakeholders.
- Bachelor’s degree in MIS, mathematics, statistics, or computer science, an international equivalent, or equivalent job experience.

Required Skills

- 3 years of experience with Databricks.
- Other required experience: SSIS/SSAS, Apache Spark, Python, R, SQL, SQL Server.

Preferred Skills

- Scala, Delta Lake, Unity Catalog, Azure Logic Apps, a cloud services platform (e.g., GCP, Azure, or AWS).

Employee Type: Permanent

At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Join us as a Data Engineering Lead

- This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
- You’ll be simplifying the bank by developing innovative data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers’ and the bank’s data safe and secure
- Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank
- We’re recruiting for multiple roles across a range of levels, up to and including experienced managers

What you'll do

We’ll look to you to demonstrate technical and people leadership to drive value for the customer through modelling, sourcing and data transformation. You’ll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering, leading a team of data engineers.

We’ll also expect you to be:

- Working with Data Scientists and Analytics Labs to translate analytical model code into well-tested, production-ready code
- Helping to define common coding standards and model monitoring performance best practices
- Owning and delivering the automation of data engineering pipelines through the removal of manual stages
- Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development
- Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
- Leading and delivering data engineering strategies to build a scalable data architecture and a feature-rich customer dataset for data scientists
- Leading and developing solutions for streaming data ingestion and transformations in line with our streaming strategy

The skills you'll need

To be successful in this role, you’ll need to be an expert-level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of data usage and dependencies with wider teams and the end customer, as well as extensive experience in extracting value and features from large-scale data.

We'll also expect you to have knowledge of big data platforms like Snowflake, AWS Redshift, Postgres, MongoDB, Neo4j and Hadoop, along with good knowledge of cloud technologies such as Amazon Web Services, Google Cloud Platform and Microsoft Azure.

You’ll also demonstrate:

- Knowledge of core computer science concepts such as common data structures and algorithms, profiling and optimisation
- An understanding of machine learning, information retrieval or recommendation systems
- Good working knowledge of CI/CD tools
- Knowledge of programming languages used in data engineering, such as Python or PySpark, SQL, Java, and Scala
- An understanding of Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow (a minimal Airflow sketch follows this posting)
- Knowledge of messaging, event or streaming technology such as Apache Kafka
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMSs, ETL pipelines, Python, Hadoop and SQL
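Pipeline automation of the kind described above is often expressed as an Apache Airflow DAG. The following is a minimal sketch rather than production code: the task bodies are stubs, the DAG id and schedule are placeholders, and the schedule= argument assumes Airflow 2.4 or later (older versions use schedule_interval=).

```python
# Minimal sketch: a two-stage Airflow DAG with a linear dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features():
    # In a real pipeline this would source and transform data for the feature set.
    print("extracting features")


def publish_dataset():
    # ...and this would publish the curated dataset for data scientists.
    print("publishing dataset")


with DAG(
    dag_id="customer_features",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    publish = PythonOperator(task_id="publish_dataset", python_callable=publish_dataset)

    # Publishing only runs once extraction has succeeded.
    extract >> publish
```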

Posted 6 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Join us as a Machine Learning Engineer

- In this role, you’ll be driving and embedding the deployment, automation, maintenance and monitoring of machine learning models and algorithms
- Day-to-day, you’ll make sure that models and algorithms work effectively in a production environment while promoting data literacy education with business stakeholders
- If you see opportunities where others see challenges, you’ll find that this solutions-driven role will be your chance to solve new problems and enjoy excellent career development

What you’ll do

Your daily responsibilities will include collaborating with colleagues to design and develop advanced machine learning products which power our group for our customers. You’ll also codify and automate complex machine learning model productions, including pipeline optimisation (a minimal training-and-persistence sketch follows this posting). We’ll expect you to transform advanced data science prototypes and apply machine learning algorithms and tools. You’ll also plan, manage, and deliver larger or complex projects, involving a variety of colleagues and teams across our business.

You’ll also be responsible for:

- Understanding the complex requirements and needs of business stakeholders, developing good relationships, and understanding how machine learning solutions can support our business strategy
- Working with colleagues to productionise machine learning models, including pipeline design, development, testing and deployment, so the original intent is carried over to production
- Creating frameworks to ensure robust monitoring of machine learning models within a production environment, making sure they deliver quality and performance
- Understanding and addressing any shortfalls, for instance, through retraining
- Leading direct reports and wider teams in an Agile way within multi-disciplinary data and analytics teams to achieve agreed project and Scrum outcomes

The skills you’ll need

To be successful in this role, you’ll need a good academic background in a STEM discipline, such as Mathematics, Physics, Engineering or Computer Science. You’ll also have the ability to use data to solve business problems, from hypotheses through to resolution. We’ll look to you to have at least twelve years of experience with machine learning on large datasets, as well as experience building, testing, supporting, and deploying advanced machine learning models into a production environment using modern CI/CD tools, including git, TeamCity and CodeDeploy.

You’ll also need:

- A good understanding of machine learning approaches and algorithms such as supervised or unsupervised learning, deep learning, and NLP, with a strong focus on model development, deployment, and optimization
- Experience using Python with libraries such as NumPy, Pandas, Scikit-learn, and TensorFlow or PyTorch
- An understanding of PySpark for distributed data processing and manipulation with AWS (Amazon Web Services), including EC2, S3, Lambda, SageMaker, and other cloud tools
- Experience with data processing frameworks such as Apache Kafka and Apache Airflow, containerization technologies such as Docker, and orchestration tools such as Kubernetes
- Experience of building GenAI solutions to automate workflows to improve productivity and efficiency
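As a rough illustration of the train-evaluate-persist loop behind productionising models (not taken from the posting), here is a minimal scikit-learn sketch on synthetic data. In practice the features, evaluation gate, and artefact versioning would come from the team's MLOps tooling.

```python
# Minimal sketch: train a model, evaluate on a holdout set, persist the artefact.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a governed feature pipeline.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# A deployment gate might compare this score against the current champion model.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

# Persist the trained artefact; a CI/CD job would version and ship this file.
joblib.dump(model, "model.joblib")
```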

Posted 6 days ago

Apply

Exploring Apache Jobs in India

The Apache Software Foundation develops and maintains a wide range of widely used open-source software projects. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise. Job seekers looking to pursue a career in Apache-related roles have a plethora of opportunities across various industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT sectors and see a high demand for Apache professionals across different organizations.

Average Salary Range

The salary range for Apache professionals in India varies based on experience and skill level:

- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum

Career Path

In the Apache job market in India, a typical career path may progress as follows:

1. Junior Developer
2. Developer
3. Senior Developer
4. Tech Lead
5. Architect

Related Skills

Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:

- Linux
- Networking
- Database Management
- Cloud Computing

Interview Questions

  • What is Apache HTTP Server and how does it differ from Apache Tomcat? (medium)
  • Explain the difference between Apache Hadoop and Apache Spark. (medium)
  • What is mod_rewrite in Apache and how is it used? (medium)
  • How do you troubleshoot common Apache server errors? (medium)
  • What is the purpose of .htaccess file in Apache? (basic)
  • Explain the role of Apache Kafka in real-time data processing. (medium)
  • How do you secure an Apache web server? (medium)
  • What is the significance of Apache Maven in software development? (basic)
  • Explain the concept of virtual hosts in Apache. (basic)
  • How do you optimize Apache web server performance? (medium)
  • Describe the functionality of Apache Solr. (medium)
  • What is the purpose of Apache Camel? (medium)
  • How do you monitor Apache server logs? (medium)
  • Explain the role of Apache ZooKeeper in distributed applications. (advanced)
  • How do you configure SSL/TLS on an Apache web server? (medium)
  • Discuss the advantages of using Apache Cassandra for data management. (medium)
  • What is the Apache Lucene library used for? (basic)
  • How do you handle high traffic on an Apache server? (medium)
  • Explain the concept of .htpasswd in Apache. (basic)
  • What is the role of Apache Thrift in software development? (advanced)
  • How do you troubleshoot Apache server performance issues? (medium)
  • Discuss the importance of Apache Flume in data ingestion. (medium)
  • What is the significance of Apache Storm in real-time data processing? (medium)
  • How do you deploy applications on Apache Tomcat? (medium)
  • Explain the concept of .htaccess directives in Apache. (basic)
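Several of the questions above concern monitoring and troubleshooting a live Apache server. As one hedged illustration, a small Python script can poll httpd's mod_status endpoint; this assumes mod_status is enabled and reachable at the placeholder URL shown, and the saturation threshold is arbitrary.

```python
# Minimal sketch: poll Apache httpd's mod_status "?auto" view for worker stats.
import requests

# Placeholder URL; "?auto" returns a machine-readable "Key: value" report.
STATUS_URL = "http://localhost/server-status?auto"

resp = requests.get(STATUS_URL, timeout=5)
resp.raise_for_status()

# Parse lines such as "BusyWorkers: 3" into a dict.
stats = dict(
    line.split(": ", 1) for line in resp.text.splitlines() if ": " in line
)

busy = int(stats.get("BusyWorkers", 0))
idle = int(stats.get("IdleWorkers", 0))
print(f"busy={busy} idle={idle}")
if idle == 0:
    print("warning: no idle workers, the server may be saturated")
```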

Conclusion

As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies