0.0 - 4.0 years
1 - 4 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
We are looking for a Computer Operator who can perform defined tasks per documented instructions and processes. Both male and female candidates can apply. Freshers and experienced candidates can apply. Basic computer knowledge is a must. Hardworking. Work from home.
Posted 2 weeks ago
0.0 - 2.0 years
1 - 4 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Calling clients to arrange tie-ups for the data entry process. Provide on-call training and connect via Zoom. Required candidate profile: team-oriented and a collaborative team player, good communication, and basic computer skills.
Posted 2 weeks ago
1.0 - 4.0 years
2 - 5 Lacs
Hyderabad
Work from Office
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description:
Let's do this. Let's change the world. We are looking for a highly motivated, expert Data Engineer who can own the design, development and maintenance of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets (see the sketch below)
- Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems
- Design and implement solutions to enable unified data access, governance, and interoperability across hybrid cloud environments
- Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB etc.), APIs, logs, event streams, images, PDFs, and third-party platforms
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
- Bring expertise in data quality, data validation and verification frameworks
- Innovate, explore and implement new tools and technologies to enhance efficient data processing
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
- Collaborate and communicate effectively with product teams and other cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:
- Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of AWS services
- Ability to quickly learn, adapt and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices
Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry
- Experience in writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications:
- Minimum 5 to 8 years of Computer Science, IT or related field experience
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized and detail oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
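As referenced in the responsibilities above, here is a minimal PySpark sketch of the kind of Databricks ETL pipeline this posting describes. The paths, table, and column names (orders, order_id, amount) are hypothetical, not taken from the posting, and it assumes a Spark environment with Delta Lake available (the Databricks default):

```python
# Minimal PySpark ETL sketch (hypothetical paths and column names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw JSON landed in cloud storage.
raw = spark.read.json("/mnt/raw/orders/")

# Transform: deduplicate, normalize types, and apply a basic quality filter.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned Delta table for downstream analytics.
(curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("/mnt/curated/orders/"))
```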
Posted 3 weeks ago
3.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data (see the sketch below)
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to standard methodologies for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree OR
- Master's degree and 4 to 6 years of Computer Science, IT or related field experience OR
- Bachelor's degree and 6 to 8 years of Computer Science, IT or related field experience OR
- Diploma and 10 to 12 years of Computer Science, IT or related field experience

Preferred Qualifications:
Functional Skills:
Must-Have Skills:
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly in data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
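To illustrate the data security and privacy responsibility above (GDPR/CCPA-style pseudonymization), here is a minimal PySpark sketch. The paths and column names (email, phone) are hypothetical, not from the posting:

```python
# Minimal PII pseudonymization sketch in PySpark (hypothetical columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking").getOrCreate()
df = spark.read.parquet("/mnt/raw/customers/")

# Replace direct identifiers with a one-way SHA-256 hash so records
# remain joinable across systems without exposing the raw values.
masked = (
    df.withColumn("email_hash", F.sha2(F.col("email"), 256))
      .withColumn("phone_hash", F.sha2(F.col("phone"), 256))
      .drop("email", "phone")
)
masked.write.mode("overwrite").parquet("/mnt/curated/customers/")
```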
Posted 3 weeks ago
3.0 - 5.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
Posted 3 weeks ago
7.0 - 11.0 years
10 - 14 Lacs
Chennai
Work from Office
What you'll do
As a Software Engineer, you will work with a world-class team developing and deploying new technologies on a cutting-edge network. You will design, develop and deploy new and innovative technology into a service provider network. Viasat's unique position as a service provider and equipment manufacturer allows you to experience the whole life cycle of software development, all the way from design to deployment.

The day-to-day
You will be a member of the software team involved in embedded software development. The software interacts with different network elements: it adapts to the L2 Subsystem on the Access Network and to service network components on the CSN Network. Our team members enjoy working closely with each other, utilizing an agile development methodology. Priorities can change quickly, but our team members stay ahead of deadlines to delight every one of our customers, whether they are internal or external to Viasat. We are searching for candidates who enjoy working with people and have a technical mind that excels when challenged.

What you'll need
- 7 to 11 years of software engineering experience in Java, with a strong emphasis on software architecture and design on Unix/Linux-based platforms
- Experience with network programming and concurrent/multithreaded programming (see the sketch below)
- Experience building CI/CD pipelines and automated software deployments
- Experience working in a cloud environment (AWS EMR)
- Familiarity with Hadoop and data processing technologies such as Kafka is advantageous
- Problem-solving experience and a DevOps approach
- Strong oral and written communication skills
- Bachelor's degree in Computer Science, Electrical Engineering, or related engineering disciplines
- Up to 10% travel

What will help you on the job
- Knowledge of tools like Jenkins, JIRA, and Git
- Experience with bash, Ansible and Python scripting in Linux
- Experience with telecom/networking/satellite/wireless communications
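The posting itself is Java-focused; purely as a language-neutral illustration of the concurrent network programming it asks for, here is a minimal Python thread-pool sketch. The host list and port check are hypothetical:

```python
# Minimal concurrent worker-pool sketch (hypothetical hosts and check logic).
from concurrent.futures import ThreadPoolExecutor, as_completed
import socket

HOSTS = ["198.51.100.10", "198.51.100.11", "198.51.100.12"]

def check_port(host: str, port: int = 22, timeout: float = 2.0) -> tuple[str, bool]:
    """Return (host, reachable) by attempting a TCP connection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return host, True
    except OSError:
        return host, False

# Run the network checks concurrently instead of one at a time.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(check_port, h) for h in HOSTS]
    for fut in as_completed(futures):
        host, ok = fut.result()
        print(f"{host}: {'up' if ok else 'down'}")
```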
Posted 3 weeks ago
3.0 - 6.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Job Details:
WCD IDG Tools Team is looking for a highly skilled and experienced Full Stack Developer to join our team. The ideal candidate will have strong expertise in both backend and frontend technologies, with a primary focus on C#, C++, Angular, and Python. You will be responsible for designing, developing, and maintaining complex web applications and services.

Key Responsibilities:
- Design, develop, and maintain scalable web applications and services.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, efficient, and well-documented code in C#, C++, Python, and Angular.
- Develop and maintain APIs and microservices.
- Participate in architectural and technical decision-making.
- Review code, mentor junior developers, and promote best practices.
- Troubleshoot, debug, and upgrade existing systems.
- Ensure security, performance, and responsiveness of applications.

Required Qualifications:
- 8+ years of professional software development experience.
- Proven expertise in C# (ASP.NET, .NET Core).
- Strong experience with C++ (preferably in high-performance or systems-level programming).
- Proficient in Python for backend services, scripting, or data processing.
- Solid experience with Angular for frontend development.
- Experience in RESTful API development and integration.
- Familiarity with database technologies (e.g., SQL Server, PostgreSQL, MongoDB).
- Understanding of CI/CD pipelines and DevOps practices.
- Strong understanding of software design patterns, data structures, and algorithms.
- Excellent problem-solving and communication skills.
- Experience with Agile/Scrum methodologies.

Soft Skills:
- Strong problem-solving and debugging capabilities.
- Fast learner with a proactive, self-driven mindset.
- Excellent communication and documentation skills.
- Ability to work both independently and within a collaborative team.

Job Type: Experienced Hire
Shift: Shift 1 (India)
Primary Location: India, Bangalore
Additional Locations:
Business group: The Client Computing Group (CCG) is responsible for driving business strategy and product development for Intel's PC products and platforms, spanning form factors such as notebooks, desktops, 2-in-1s and all-in-ones. Working with our partners across the industry, we intend to deliver purposeful computing experiences that unlock people's potential, allowing each person to use our products to focus, create and connect in ways that matter most to them. As the largest business unit at Intel, CCG is investing more heavily in the PC, ramping its capabilities even more aggressively, and designing the PC experience even more deliberately, including delivering a predictable cadence of leadership products. As a result, we are able to fuel innovation across Intel, providing an important source of IP and scale, as well as help the company deliver on its purpose of enriching the lives of every person on earth.
Posting Statement: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.
Position of Trust: N/A
Work Model for this Role: This role will require an on-site presence.
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Summary:
We are seeking an experienced Data Engineer with expertise in Snowflake and PL/SQL to design, develop, and optimize scalable data solutions. The ideal candidate will be responsible for building robust data pipelines, managing integrations, and ensuring efficient data processing within the Snowflake environment. This role requires a strong background in SQL, data modeling, and ETL processes, along with the ability to troubleshoot performance issues and collaborate with cross-functional teams.

Responsibilities:
- Design, develop, and maintain data pipelines in Snowflake to support business analytics and reporting (see the sketch below).
- Write optimized PL/SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Integrate and manage data from various structured and unstructured sources into the Snowflake data platform.
- Optimize Snowflake performance by tuning queries, managing workloads, and implementing best practices.
- Collaborate with data architects, analysts, and business teams to develop scalable and high-performing data solutions.
- Ensure data security, integrity, and governance while handling large-scale datasets.
- Automate and streamline ETL/ELT workflows for improved efficiency and data consistency.
- Monitor, troubleshoot, and resolve data quality issues, performance bottlenecks, and system failures.
- Stay updated on Snowflake advancements, best practices, and industry trends to enhance data engineering capabilities.

Required Skills:
- Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Hands-on expertise in PL/SQL, including writing and optimizing complex queries, stored procedures, and functions.
- Proven ability to work with large datasets, data warehousing concepts, and cloud-based data management.
- Proficiency in SQL, data modeling, and database performance tuning.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Familiarity with cloud platforms such as AWS, Azure, or GCP is an added advantage.
- Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced) are a plus.
- Strong analytical skills, problem-solving abilities, and attention to detail.
- Excellent communication skills and ability to work effectively in a collaborative environment.
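As referenced above, here is a minimal sketch of programmatic Snowflake access using the snowflake-connector-python package. The account identifier, warehouse, and table names are hypothetical placeholders, not from the posting:

```python
# Minimal Snowflake ELT sketch (hypothetical credentials and table names).
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER",            # hypothetical
    password="***",             # use a secrets manager in practice
    account="myorg-myaccount",  # hypothetical account identifier
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # A typical ELT step: aggregate staged data into a reporting table.
    cur.execute("""
        INSERT INTO daily_sales_summary
        SELECT order_date, SUM(amount)
        FROM raw_orders
        GROUP BY order_date
    """)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```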
Posted 3 weeks ago
7.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Lead Python Developer
Experience: 7+ years
Location: Bangalore/Hyderabad

Job Overview
We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role requires a combination of software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience in building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & Microservices: Design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks (see the sketch below).
- Data Engineering: Work on data pipelines, ETL processes, and data processing for robust data solutions.
- System Architecture: Collaborate on the design and implementation of configurable and reusable frameworks to streamline processes.
- Collaborate with Cross-Functional Teams: Work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that cater to both application and data needs.
- Slack App Development: Design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code Quality: Ensure high-quality coding standards through rigorous testing, code reviews, and writing maintainable code.
- SQL Expertise: Write efficient and optimized SQL queries for data storage, retrieval, and analysis.
- Microservices Architecture: Build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming Languages: Expert in Python, with solid experience building APIs and microservices.
- Web Frameworks & APIs: Strong hands-on experience with FastAPI and Flask (optional), designing RESTful APIs.
- Data Engineering Expertise: Strong knowledge of SQL, relational databases, and ETL processes. Experience with cloud-based data solutions is a plus.
- API & Microservices Architecture: Proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack App Development: Experience with integrating Slack apps or creating custom Slack workflows.
- Reusable Framework Development: Ability to design modular and configurable frameworks that can be reused across various teams and systems.
- Excellent Problem-Solving Skills: Ability to break down complex problems and deliver practical solutions.
- Software Development Experience: Strong software engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth Opportunities: You'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative Culture: A dynamic and inclusive team where your ideas and contributions are valued.
- Competitive Compensation: We offer a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative Projects: Be a part of projects that have a real-world impact and help shape the future of data and software development.

If you're passionate about working on both data and software engineering, and enjoy building scalable and efficient systems, apply today and help us innovate!
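A minimal FastAPI sketch of the kind of REST endpoint this posting describes; the Item model, routes, and in-memory store are hypothetical illustrations:

```python
# Minimal FastAPI REST sketch (hypothetical model and route names).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

# In-memory store standing in for a real database.
ITEMS: dict[int, Item] = {}

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    ITEMS[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```

Saved as main.py, this can be served locally with a standard ASGI server, e.g. `uvicorn main:app --reload`.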
Posted 3 weeks ago
3.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Senior Machine Learning Engineer - Recommender Systems
Join our team at Thomson Reuters and contribute to the global knowledge economy. Our innovative technology influences global markets and supports professionals worldwide in making pivotal decisions. Collaborate with some of the brightest minds on diverse projects to craft next-generation solutions that have a significant impact. As a leader in providing intelligent information, we value the unique perspectives that foster the advancement of our business and your professional journey.

Are you excited about the opportunity to leverage your extensive technical expertise to guide a development team through the complexities of full life cycle implementation at a top-tier company? Our Commercial Engineering team is eager to welcome a skilled Senior Machine Learning Engineer to our established global engineering group. We're looking for someone enthusiastic, an independent thinker, who excels in a collaborative environment across various disciplines, and is at ease interacting with a diverse range of individuals and technological stacks. This is your chance to make a lasting impact by transforming customer interactions as we develop the next generation of an enterprise-wide experience.

About the Role:
As a Machine Learning Engineer, you will:
- Spearhead the development and technical implementation of machine learning solutions, including configuration and integration, to fulfill business, product, and recommender system objectives (see the sketch below).
- Create machine learning solutions that are scalable, dependable, and secure.
- Craft and sustain technical outputs such as design documentation and representative models.
- Contribute to the establishment of machine learning best practices, technical standards, model designs, and quality control, including code reviews.
- Provide expert oversight, guidance on implementation, and solutions for technical challenges.
- Collaborate with an array of stakeholders, cross-functional and product teams, business units, technical specialists, and architects to grasp the project scope, requirements, solutions, data, and services.
- Promote a team-focused culture that values information sharing and diverse viewpoints.
- Cultivate an environment of continual enhancement, learning, innovation, and deployment.

About You:
You are an excellent candidate for the role of Machine Learning Engineer if you possess:
- At least 3 years of experience in addressing practical machine learning challenges, particularly with recommender systems, to enhance user efficiency, reliability, and consistency.
- A profound comprehension of data processing, machine learning infrastructure, and DevOps/MLOps practices.
- A minimum of 2 years of experience with cloud technologies (AWS SageMaker; AWS is preferred), including services, networking, and security principles.
- Direct experience in machine learning and orchestration, developing intricate multi-tenant machine learning products.
- Proficient Python programming skills, SQL, and data modeling expertise, with DBT considered a plus.
- Familiarity with Spark, Airflow, PyTorch, Scikit-learn, Pandas, Keras, and other relevant ML libraries.
- Experience in leading and supporting engineering teams.
- A robust background in crafting data science and machine learning solutions.
- A creative, resourceful, and effective problem-solving approach.
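As referenced above, here is a minimal collaborative-filtering sketch using scikit-learn (one of the libraries the posting names). The toy user-item ratings matrix is entirely hypothetical:

```python
# Minimal matrix-factorization recommender sketch (toy, hypothetical ratings).
import numpy as np
from sklearn.decomposition import TruncatedSVD

# Rows = users, columns = items; 0 means "not yet rated".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

# Factorize into low-rank user/item representations.
svd = TruncatedSVD(n_components=2, random_state=0)
user_factors = svd.fit_transform(ratings)   # shape: (n_users, k)
item_factors = svd.components_              # shape: (k, n_items)

# Reconstructed scores approximate preferences for unrated items.
scores = user_factors @ item_factors
user = 0
unseen = np.where(ratings[user] == 0)[0]
best = unseen[np.argmax(scores[user, unseen])]
print(f"Recommend item {best} to user {user}")
```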
#LI-HG1

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 3 weeks ago
0.0 - 1.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Software Engineering Associate Analyst - API Developer

Position Overview
Developer with hands-on design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot with demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle.

Responsibilities
- Design, develop, and maintain APIs using Java and Spring Boot, and ensure efficient data exchange between applications.
- Implement API security measures including authentication, authorization, and rate limiting (see the sketch below).
- Document API specifications and maintain API documentation for internal and external users.
- Develop integrations with different data sources and other APIs/web services.
- Develop integrations with IBM MQ and Kafka.
- Develop and maintain CI/CD pipelines.
- Perform performance evaluation and application tuning.
- Monitor and troubleshoot applications for stability and performance.

Qualifications
Required Skills & Experience:
- 0-1 years of experience
- Programming Languages: Proficiency in Java.
- Web Development: Experience with SOAP and RESTful services.
- Database Management: Strong knowledge of SQL (Oracle).
- Version Control: Expertise in using version control systems like Git.
- CI/CD: Familiarity with CI/CD tools such as GitLab CI and Jenkins.
- Containerization & Orchestration: Experience with Docker and OpenShift.
- Messaging Queues: Knowledge of IBM MQ and Apache Kafka.
- Cloud Services: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
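The posting's stack is Java/Spring Boot; as a language-neutral illustration of the rate-limiting responsibility above, here is a minimal token-bucket sketch in Python. The capacity and refill rate are hypothetical:

```python
# Minimal token-bucket rate limiter sketch (hypothetical capacity/rate).
import time

class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available, refilling based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Allows a burst of 5 requests, then roughly 1 request per second.
bucket = TokenBucket(capacity=5, refill_per_sec=1.0)
for i in range(7):
    print(i, "allowed" if bucket.allow() else "rejected (HTTP 429)")
```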
Posted 3 weeks ago
2.0 - 7.0 years
10 - 15 Lacs
Pune
Work from Office
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.
Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose — people — then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

We are looking for a talented and experienced Software Engineer II to join our dynamic team. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Software Engineer II, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. We are seeking engineers with diverse specialties and skills to join our dynamic team to innovate and solve complex challenges. Our team is looking for strong talent with expertise in the following areas:
- Front End UI Engineer (UI/UX design principles, responsive design, JavaScript frameworks)
- DevOps Engineer (CI/CD pipelines, IaC proficiency, containerization/orchestration, cloud platforms)
- Back End Engineer (API development, database management, security practices, message queuing)
- AI/ML Engineer (machine learning frameworks, data processing, algorithm development, big data technologies, domain knowledge)

Responsibilities:
- Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
- Design and Architecture: Participate in design reviews with peers and stakeholders.
- Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
- Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide (see the sketch below).
- Debugging and Troubleshooting: Triage defects or customer-reported issues; debug and resolve them in a timely and efficient manner.
- Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
- DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production.
- Documentation: Properly document new features, enhancements or fixes to the product, and contribute to training materials.
Basic Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 2+ years of professional software development experience.
- Proficiency in one or more programming languages such as C, C++, C#, .NET, Python, Java, or JavaScript.
- Experience with software development practices and design patterns.
- Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
- Basic understanding of cloud technologies and DevOps principles.
- Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.

Preferred Qualifications:
- Experience with cloud platforms like Azure, AWS, or GCP.
- Experience with test automation frameworks and tools.
- Knowledge of agile development methodologies.
- Commitment to continuous learning and professional development.
- Good communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.

Where we're going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

Disability Accommodation in the Application and Interview Process: UKGCareers@ukg.com
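To illustrate the base of the test automation pyramid mentioned above, here is a minimal pytest unit-test sketch; the function under test is hypothetical, not from the posting:

```python
# Minimal pytest unit-test sketch (hypothetical function under test).
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical production code: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_happy_path():
    assert apply_discount(100.0, 25) == 75.0

def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```

Run with `pytest -q`; fast, isolated tests like these form the wide bottom layer of the pyramid, with fewer integration and UI (e.g., Selenium) tests above them.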
Posted 3 weeks ago
2.0 - 11.0 years
20 - 25 Lacs
Bengaluru
Work from Office
This is a hybrid position located in Bangalore. You will be required to be onsite on an as-needed basis, typically 1 to 6 times a month. We are only considering candidates within a commutable distance and are not offering relocation assistance at this time.

About The Role:
- Develop and deploy cloud-based distributed applications, ensuring they are efficient, secure, and scalable.
- Build and optimize large-scale data processing pipelines using Spark/PySpark, integrating with cloud-native services.
- Monitor the performance and health of cloud-based applications and infrastructure, ensuring they meet performance, scalability, and security standards.
- Collaborate with developers, SDETs, researchers, DevOps engineers, and system administrators to ensure smooth development, deployment, and operations of data-driven applications.
- Implement data analytics and dashboarding solutions to support business intelligence, operational monitoring, and model performance tracking.
- Contribute to the development, deployment, and monitoring of machine learning models, with a clear understanding of the end-to-end ML lifecycle, including data collection, preprocessing, model (re)training, evaluation, deployment, and feedback loops (see the sketch below).

About You:
- Proficient in programming languages such as Python and .NET, with an understanding of unit test frameworks in Python and .NET.
- Strong understanding of cloud platforms, particularly AWS and Google Cloud Platform (GCP), and their services.
- Solid grasp of AI and ML concepts, tools (MLflow etc.), and best practices.
- Understanding of the end-to-end ML lifecycle and workflows in production-grade environments.
- Knowledge of cloud security best practices and the ability to implement them in application and infrastructure design.
- Familiarity with containerization and orchestration technologies such as Docker and Kubernetes.
- Competent in working with both relational and non-relational databases to support dynamic application needs.

Company Overview
McAfee is a leader in personal security for consumers. Focused on protecting people, not just devices, McAfee consumer solutions adapt to users' needs in an always-online world, empowering them to live securely through integrated, intuitive solutions that protect their families and communities with the right security at the right moment.

Company Benefits and Perks:
We work hard to embrace diversity and inclusion and encourage everyone at McAfee to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
- Bonus Program
- Pension and Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement

We're serious about our commitment to diversity, which is why McAfee prohibits discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.
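As referenced above, here is a minimal MLflow experiment-tracking sketch of the retrain-evaluate-log loop the role describes. The dataset, parameters, and run name are hypothetical; it assumes mlflow and scikit-learn are installed:

```python
# Minimal MLflow experiment-tracking sketch (hypothetical params/metrics).
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline"):
    C = 1.0
    model = LogisticRegression(C=C, max_iter=1000).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    # Log params/metrics so successive retraining runs can be compared.
    mlflow.log_param("C", C)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```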
Posted 3 weeks ago
2.0 - 7.0 years
5 - 9 Lacs
Mumbai
Work from Office
The purpose of this role is to execute data processing for our clients. The role holder understands input file requirements and output file requirements, along with data processing capabilities, and ultimately delivers data processing results for our clients.

Job Description:
- Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field
- 2+ years of experience working with UNCLE, Q, Dimension, SPSS syntax or similar data processing software
- Experience working with data for large, multi-market complex projects
- Experience in data processing, including weighting, stacking and significance testing, with a good understanding of industry best practices (see the sketch below)
- Should be able to perform extensive quality checks and data validation to ensure accuracy
- Coordinate with internal project managers/client services team members to finalize materials (data processing spec forms); provide guidance on tool functionality and solutions
- Develop and maintain data processing workflows and documentation
- Microsoft Excel skills required, with VBA macro writing experience/knowledge a plus
- Should be comfortable working night shifts (rotational), providing 24/7 operational support, and working on weekends (roster)
- Client-focused with strong consulting, communication, and collaboration skills
- Emotionally intelligent, adept at conflict resolution, and thrives in high-pressure, fast-paced environments
- Demonstrates ownership, problem-solving ability, and effective multitasking and prioritization

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
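As referenced above, here is a minimal sketch of the significance testing ("sig testing") used in survey data processing, using a two-sample proportion z-test from statsmodels. The awareness counts and base sizes are hypothetical:

```python
# Minimal survey "sig testing" sketch (hypothetical counts).
# Compares brand-awareness rates between two markets at the 95% level.
from statsmodels.stats.proportion import proportions_ztest

aware = [312, 268]  # respondents aware of the brand in markets A and B
base = [500, 500]   # base sizes for markets A and B

stat, pvalue = proportions_ztest(count=aware, nobs=base)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
print("Significant at 95%" if pvalue < 0.05 else "Not significant at 95%")
```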
Posted 3 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Work from Office
Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.

The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles, data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.

Responsibilities
- Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
- Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security (see the sketch below).
- Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. DOMO, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.

Requirements
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering).
- 3+ years of experience using GCP Data Lake and storage services.
- Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.

Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal.
Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
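As referenced above, here is a minimal google-cloud-bigquery sketch of querying a data-lake table. The project, dataset, and table names are hypothetical, and credentials are assumed to come from the environment:

```python
# Minimal BigQuery query sketch (hypothetical project/dataset/table names).
from google.cloud import bigquery

# Credentials are resolved from the environment
# (e.g. the GOOGLE_APPLICATION_CREDENTIALS variable).
client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-analytics-project.lake.web_events`
    GROUP BY event_date
    ORDER BY event_date DESC
    LIMIT 7
"""
# query() submits the job; result() blocks until rows are ready.
for row in client.query(query).result():
    print(row.event_date, row.events)
```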
Posted 3 weeks ago
2.0 - 7.0 years
2 - 6 Lacs
Mumbai
Work from Office
The purpose of this role is to execute data processing for our clients. The role holder understands input file requirements and output file requirements, along with data processing capabilities, and ultimately delivers data processing results for our clients.

Job Description:
- Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field
- 2+ years of experience working with UNCLE, Q, Dimension, SPSS syntax or similar data processing software
- Experience working with data for large, multi-market complex projects
- Experience in data processing, including weighting, stacking and significance testing, with a good understanding of industry best practices
- Should be able to perform extensive quality checks and data validation to ensure accuracy
- Coordinate with internal project managers/client services team members to finalize materials (data processing spec forms); provide guidance on tool functionality and solutions
- Develop and maintain data processing workflows and documentation
- Microsoft Excel skills required, with VBA macro writing experience/knowledge a plus
- Should be comfortable working night shifts (rotational), providing 24/7 operational support, and working on weekends (roster)
- Client-focused with strong consulting, communication, and collaboration skills
- Emotionally intelligent, adept at conflict resolution, and thrives in high-pressure, fast-paced environments
- Demonstrates ownership, problem-solving ability, and effective multitasking and prioritization

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
This role is crucial for supporting our clients with advanced analytics and reporting capabilities. The Insights and Reporting Specialist will transform data into actionable insights, driving strategic decision-making and business growth. Using tools such as GCP/SQL/Python and Tableau/Power BI/Looker Studio, the specialist will ensure efficient data handling and visualization. Their expertise in Retail/CPG, along with strong communication skills, will enable them to address industry-specific challenges effectively. This position is associated with significant revenue and future opportunities, making it a strategic asset for our organization.
Job Description:
Responsibilities:
Develop and maintain data tables (management, extraction, harmonization, etc.) using GCP/SQL/Snowflake. This involves designing, implementing, and writing optimized code; maintaining complex SQL queries to extract, transform, and load (ETL) data from various tables/sources; and ensuring data integrity and accuracy throughout the data pipeline.
Create and manage data visualizations using Tableau/Power BI/Looker Studio. This involves designing and developing interactive dashboards and reports, ensuring visualizations are user-friendly, insightful, and aligned with business requirements, and regularly updating and maintaining dashboards to reflect the latest data and insights.
Generate insights and reports to support business decision-making. This includes analyzing data trends and patterns to provide actionable insights, preparing comprehensive reports that summarize key findings and recommendations, and presenting data-driven insights to stakeholders to inform strategic decisions.
Handle ad-hoc data requests and provide timely solutions. This involves responding to urgent data requests from various departments; quickly gathering, analyzing, and delivering accurate data to meet immediate business needs; and ensuring ad-hoc solutions are scalable and reusable for future requests.
Collaborate with stakeholders to understand and solve open-ended questions. This includes engaging with business users to identify their data needs and challenges, working closely with cross-functional teams to develop solutions for complex, open-ended problems, and translating business questions into analytical tasks that deliver meaningful results.
Design and create wireframes and mockups for data visualization projects. This involves developing wireframes and mockups to plan and communicate visualization ideas, collaborating with stakeholders to refine and finalize visualization designs, and ensuring that wireframes and mockups align with user requirements and best practices.
Communicate findings and insights effectively to both technical and non-technical audiences. This includes preparing clear and concise presentations, tailoring communication styles to suit the technical proficiency of the audience, and using storytelling techniques to make data insights more engaging and understandable.
Perform data manipulation and analysis using Python. This includes utilizing libraries such as Pandas, NumPy, and SciPy for data cleaning, transformation, and analysis; developing scripts and automation tools to streamline data processing tasks; and conducting statistical analysis to generate insights from large datasets.
Implement basic machine learning models using Python. This involves developing and applying basic machine learning models to enhance data analysis, using libraries such as scikit-learn and TensorFlow for model development and evaluation, and interpreting and communicating model results to stakeholders (see the sketch after this posting).
Automate data processes using Python. This includes creating automation scripts to streamline repetitive data tasks, implementing scheduling and monitoring of automated processes to ensure reliability, and continuously improving automation workflows to increase efficiency.
Requirements:
3 to 5 years of experience in data analysis, reporting, and visualization, including a proven track record of delivering impactful data projects and experience in a similar role within a fast-paced environment.
Proficiency in GCP/SQL/Snowflake/Python for data manipulation, including strong knowledge of GCP/SQL/Snowflake services and tools, advanced SQL skills for complex query writing and optimization, and expertise in Python for data analysis and automation.
Strong experience with Tableau/Power BI/Looker Studio for data visualization, including a demonstrated ability to create compelling, informative dashboards and familiarity with best practices in data visualization and user experience design.
Excellent communication skills, with the ability to articulate complex information clearly and explain technical concepts to non-technical stakeholders.
Proven ability to solve open-ended questions and handle ad-hoc requests, with creative problem-solving skills, a proactive approach to challenges, and flexibility to adapt to changing priorities and urgent requests.
Strong problem-solving skills and attention to detail, including a keen eye for accuracy in data analysis and reporting and the ability to identify and resolve data quality issues.
Experience in creating wireframes and mockups, including proficiency in design tools and effectively translating ideas into visual representations.
Ability to work independently and as part of a team: self-motivated, able to manage multiple tasks simultaneously, with a collaborative mindset and willingness to support team members.
Basic machine learning skills, including an understanding of fundamental machine learning concepts and techniques and experience with libraries such as scikit-learn and TensorFlow.
Domain-specific knowledge in Retail/CPG, including an understanding of industry-specific challenges and data requirements, experience in analyzing and reporting data within these domains, familiarity with the key performance indicators (KPIs) relevant to them, and the ability to apply domain knowledge to generate relevant insights and solutions.
Location: DGS India - Bengaluru - Manyata H2 block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
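To make the Python responsibilities above concrete, here is a minimal, hedged sketch of the Pandas-plus-scikit-learn workflow the posting describes; the input file, column names, and the promotion-prediction task are hypothetical stand-ins, not part of the posting:

```python
# Minimal sketch of the Pandas + scikit-learn workflow described above.
# The dataset and column names are hypothetical, for illustration only.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Data cleaning and transformation with pandas
df = pd.read_csv("sales.csv")  # hypothetical source file
df = df.dropna(subset=["units", "price", "promo_flag"])
df["revenue"] = df["units"] * df["price"]

# A basic classifier, e.g. predicting whether a SKU was on promotion
X = df[["units", "price", "revenue"]]
y = df["promo_flag"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```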
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Coimbatore
Work from Office
About Responsive
Responsive (formerly RFPIO) is the global leader in strategic response management software, transforming how organizations share and exchange critical information. The AI-powered Responsive Platform is purpose-built to manage responses at scale, empowering companies across the world to accelerate growth, mitigate risk, and improve employee experiences. Nearly 2,000 customers have standardized on Responsive to respond to RFPs, RFIs, DDQs, ESGs, security questionnaires, ad hoc information requests, and more. Learn more at responsive.io.
About the Role
Responsive is looking for a product-minded Software Engineer with strong technical skills and a passion for building scalable solutions. This is an opportunity to work in a fast-paced, innovative environment and contribute to the growth of a top-tier SaaS company.
What You'll Be Doing
Distributed Systems Development: Design, develop, and maintain the Responsive application, ensuring high performance, scalability, and reliability.
Performance Tuning: Monitor and optimize performance, addressing bottlenecks and ensuring low-latency query responses.
Collaboration: Work closely with cross-functional and geographically distributed teams, including product managers, frontend engineers, and UX designers, to deliver seamless and intuitive experiences.
Continuous Improvement: Stay updated with the latest trends and advancements in technologies, conducting research and experimentation to drive innovation.
What We're Looking For
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: 2+ years of experience in software design, development, and algorithm-related solutions using Java and related technologies.
Skills, Qualifications & Ability:
Strong proficiency in Java programming, Java design patterns, and server-side Java development
Demonstrated versatility in multiple front-end and back-end technologies such as Spring, React, MongoDB, etc.
Experience working with cloud platforms such as GCP, Azure, or AWS
Knowledge of cloud-native services for AI/ML, data storage, and processing
Expertise in search and retrieval technologies, including search platforms like ElasticSearch, Apache Solr, or similar, with experience in AI-driven solutions, Natural Language Processing (NLP), semantic search, and text processing techniques, is a plus (see the sketch after this posting)
Proficiency in Test-Driven Development (TDD)
Scrum and JIRA experience is a plus
Experience working in a fast-paced, dynamic environment, preferably in a SaaS or technology-driven company
Why Join Us?
Impact-Driven Work: Build innovative solutions that redefine strategic response management
Collaborative Environment: Work with a passionate team of technologists, designers, and product leaders
Career Growth: Be part of a company that values learning and professional development
Competitive Benefits: We offer comprehensive compensation and benefits to support our employees
Trusted by Industry Leaders: Be part of a product that is trusted by world-leading organizations
Cutting-Edge Technology: Work on AI-driven solutions, cloud-native architectures, and large-scale data processing
Diverse and Inclusive Workplace: Collaborate with a global team that values different perspectives and ideas
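As a rough illustration of the search-platform skills named above, here is a minimal sketch using the official Elasticsearch Python client (v8-style API), shown in Python for brevity; the cluster URL, index name, and document fields are hypothetical:

```python
# Minimal keyword-search sketch against Elasticsearch. The cluster URL,
# index name, and document fields are hypothetical, for illustration only.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical local cluster

# Index a sample response document
es.index(index="responses", id="1", document={
    "question": "Describe your data retention policy.",
    "answer": "Data is retained for seven years per policy X.",
})
es.indices.refresh(index="responses")  # make the document searchable now

# Full-text match query over the question field
hits = es.search(index="responses", query={"match": {"question": "retention"}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"]["answer"])
```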
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Patna
Work from Office
Position Reporting: State Impact Sr. Specialist, Operational, reporting to District Operations Lead
Position Reportees: Impact Associate; Data Entry Operator; Impact Trainee
Vision: We aim to achieve behavioural, social, and economic transformation for all girls towards an India where all children have equal opportunities to access quality education.
Background: Educate Girls (a project of Foundation to Educate Girls Globally) is a non-profit organization that focuses on mobilizing communities for girls' education in India's rural and educationally backward areas. Strongly aligned with the Right to Education Act and Samagra Shiksha, the organization is committed to the Government's vision to improve access to primary education for children, especially young girls. Educate Girls currently operates successfully in over 20,000 villages in Rajasthan, Madhya Pradesh, and Uttar Pradesh. By leveraging the Government's existing investment in schools and by engaging a huge base of community volunteers, Educate Girls helps to identify, enrol, and retain out-of-school girls and to improve foundational skills in literacy and numeracy for all children (both girls and boys). This helps deliver measurable results to a large number of children and avoids parallel delivery of services. Visit www.educategirls.ngo for detailed information on our vision, mission, and programs.
Position Overview: The position is responsible for implementation of all Monitoring & Evaluation (M&E) activities in the district. The incumbent will work closely on technical aspects with the State Impact Sr. Specialist to track the progress of the program on the basis of performance indicators, and will be responsible for ensuring timely collection, analysis, and dissemination of all M&E data collected in the district. The position will also support the district lead in all data analysis for decision-making and train the district team on quality parameters. The role involves intense travel in operational areas, especially in blocks and villages, based on programmatic need.
Position Key Responsibilities:
Strategy, planning, and review:
Develop and support the district plan as well as the impact function plan, and plan for regular review on defined indicators
Develop an Impact Calendar for all impact team members
Design actionable M&E micro-plans on an annual and contingent basis as per the M&E framework
Plan and develop strategy for M&E implementation and course correction for the assigned district
Manage the implementation of impact assessment activities in the assigned district
Support the design, development, and roll-out of data collections where required and not available in core systems (AGP/Phase III/COVID)
Roll out and implement new tools, and pilot them as per the monitoring and evaluation plan
Prepare detailed and up-to-date monitoring findings and a database on project course correction in order to obtain optimal outcomes
Supervise monitoring activity planning and schedule action plans for implementation of M&E activities
Ensure the collection of program information on key indicators from the community, schools, and other stakeholders for enhancement of the program
Uphold quality assurance of data and program delivery for the respective district, and ensure timely course corrections in data and activities by sharing feedback and insights with program operations
Maintain high standards for rigor, data quality, and related good practices within intervention-based data collection and processing
Ensure updates are available from the start of each activity and contain timely red flags and actionable insights
Coordinate with district impact staff to track data collection and on-time data entry
Verify data collected by field staff (as per sample) for validation
Manage the web-based MIS portal and mobile application for data management
Conduct process validation and spot checks of School Management Committees, Mohalla Meetings, Gram Shiksha Sabhas, Bal Sabhas, and School Improvement Plans
Monitor the field visits of the district impact team for cross-verification and spot checks
Provide monthly quality monitoring updates, including QA done and insights cascaded from FC to block to district to region
Monitor data, finalize data entry, and complete data approval and preparation on time for internal/donor/government reporting
Prepare and compile monthly reports, and present data and insights in monthly meetings at district and regional levels
Attend block-level meetings for handholding of field staff
Training and people development:
Ensure timely training of district/field staff on new processes
Train staff, including volunteers, on data collection, recording, reporting, data processing, functional induction, etc.
Build capability for remote monitoring and embed it into all QM activities
Identify training needs of Impact team members and recommend training plans, including technical and other trainings
Ensure efficiency of the data collection and analysis process, and strengthen the use of data to improve quality
Ensure budget utilization (team travel tracking against budget) within specified limits
Participate in training programs organized by the Program team
Support in managing donor visits and internal/external visits
Ensure that the Impact unit is staffed as per planned position requirements in the district
Ensure that staff appraisals are conducted for the Impact team as per the periodicity specified by the organization's HR policy
Support the IT team in rolling out the PMS Mobile Application
Support other functions with their data requirements
Desired Incumbent Profile:
Personality: Self-driven and result-oriented with a positive outlook and a clear focus on high-quality output. Excellent conceptual and analytical skills. Demonstrable ability to think strategically, innovatively, and practically to ensure achievement of desired change objectives. Proactive approach to problem-solving with strong decision-making capability. Strong organizational skills reflecting the ability to perform and prioritize multiple tasks seamlessly with excellent attention to detail. Strong interpersonal skills and the ability to build relationships with multiple stakeholders. Empathic communicator, able to see things from the other person's point of view. Gets along with a variety of individuals and is a team player. Sufficiently mobile and flexible to manage intense travel in operational areas, especially in blocks and villages, which could amount to 50-60% of the time based on programmatic need.
Work-life balance: Must be mature and domestically secure, able to manage travel without upsetting the domestic situation, and able to work extended hours on occasion when required.
Technology skills: Must be adept in the use of MS Office, particularly Excel, Word, and PowerPoint, and ideally Access or a similar database to a basic level, as well as the internet and email. Open to learning and adapting to new technologies introduced in the organization. Sound contextual knowledge of local issues, organizational relationships, social and cultural constraints and realities, environmental conditions, the Right to Education, child psychology, and community motivation. Ability to manage a large variety of data, excellent data analysis skills, research software proficiency, and knowledge of web/Android-friendly software applications. Able to run data management tools, modules, or software.
Demonstrated ability to cultivate relationships, collaborate with individuals in a culturally diverse setting, and build consensus. Ability to multitask and perform under stress. Ability to treat people equally irrespective of gender. Integrity towards the work and the ability to know and do what is right. Works effectively and inclusively with a range of people both within and outside the organization. Empathy.
Adherence to Code of Conduct & EG Policies: All existing and new employees shall ensure that they at all times act in compliance with EG's laid-down Code of Conduct and adhere to all EG policies, including but not limited to the Workplace Harassment Policy, Sexual Harassment Prevention and Redressal Policy, Child Protection Policy, Code of Conduct Policy, Whistleblower Policy, Work from Home Policy, and Diversity and Inclusion Policy. EG has a zero-tolerance policy for all forms of discrimination.
Preferred Education Background: Graduate in Economics, Statistics, Social Sciences, or a related field preferred. Fluent in Hindi and the local dialect, with basic knowledge of English.
Preferred Work Experience: Minimum 3-5 years of experience in a relevant field, with experience in data analysis and data mining.
Posted 3 weeks ago
2.0 - 7.0 years
4 - 8 Lacs
Mumbai
Work from Office
The purpose of this role is to execute data processing for our clients. The role requires understanding input and output file requirements along with data processing capabilities, and delivering accurate data processing results to clients.
Job Description:
Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field
2+ years of experience working with UNCLE, Q, Dimension, SPSS syntax, or similar data processing software
Experience working with data for large, multi-market, complex projects
Experience in data processing, including weighting, stacking, and significance testing, with a good understanding of industry best practices
Able to perform extensive quality checks and data validation to ensure accuracy
Coordinate with internal project managers / client services team members to finalize materials (data processing spec forms); provide guidance on tool functionality and solutions
Develop and maintain data processing workflows and documentation
Microsoft Excel skills required; VBA macro writing experience/knowledge a plus
Comfortable working rotational night shifts and weekends on a roster basis as part of 24/7 operational support
Client-focused with strong consulting, communication, and collaboration skills
Emotionally intelligent, adept at conflict resolution, and thrives in high-pressure, fast-paced environments
Demonstrates ownership, problem-solving ability, and effective multitasking and prioritization
Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Summary
Guidewire is searching for a unique individual who is ambitious, curious, and hungry for a rare chance to transform a 500-year-old industry from the inside out. Through our data listening capabilities, we collect more data (and more important data) than any other company in our market. We seek ways to make sense of it, showcase it, and transform it into insight that feeds billions of decision points every year across pricing, portfolio management, underwriting, claims management, and risk transfer. At Guidewire, a combination of good working conditions, an excellent market opportunity, a rational and meritocratic company culture, quality software products, and a long history of careful hiring has allowed us to create an enviable work environment.
Guidewire Analytics helps insurers and other financial institutions model new and evolving risks such as cyber. By combining internet-scale data listening, adaptive machine learning, and insurance risk modeling, Guidewire Analytics insights help P&C customers face new risks, take advantage of new opportunities, and develop new products.
Job Description
Responsibilities:
Development: Develop robust, scalable, and efficient data pipelines (see the sketch after this posting). Manage platform solutions to support data engineering needs and ensure seamless integration and performance. Write clean, efficient, and maintainable code.
Data Management and Optimization: Ensure data quality, integrity, and security across all data pipelines. Optimize data processing workflows for performance and cost-efficiency. Develop and maintain comprehensive documentation for data pipelines and related processes.
Innovation and Continuous Improvement: Stay current with emerging technologies and industry trends in big data and cloud computing. Propose and implement innovative solutions to improve data processing and analytics capabilities. Continuously evaluate and improve existing data infrastructure and processes.
Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
2+ years of experience in software engineering with a focus on data engineering and building data platforms
Strong programming experience using Python or Java
Experience with big data technologies such as Apache Spark, Amazon EMR, Apache Iceberg, Amazon Redshift, or similar
Experience with RDBMS (Postgres, MySQL, etc.) and NoSQL (MongoDB, DynamoDB, etc.) databases
Experience with AWS cloud services (e.g., Lambda, S3, Athena, Glue) or comparable cloud technologies
Experience with CI/CD
Experience working with event-driven and serverless architectures
Experience with platform solutions and containerization technologies (e.g., Docker, Kubernetes)
Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment
Strong communication skills, both written and verbal
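As a hedged sketch of the kind of data pipeline the posting above describes, here is a minimal PySpark batch job that reads raw events, cleans them, and writes partitioned output; the S3 paths and column names are hypothetical:

```python
# Minimal PySpark batch-pipeline sketch: read raw events, clean, and write
# partitioned output. Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path

cleaned = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Partitioned Parquet output keeps downstream scans cheap
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```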
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks/BigQuery/Airflow/Composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization (see the sketch after this posting). Collaborate with cross-functional teams to resolve technical issues and gather requirements.
Your Key Responsibilities
Ensure data quality and integrity through data validation and cleansing processes.
Analyze existing SQL queries, functions, and stored procedures for performance improvements.
Develop database routines such as procedures, functions, and views/materialized views.
Participate in data migration projects and understand technologies like Delta Lake, data warehouses, and BigQuery.
Debug and solve complex problems in data pipelines and processes.
Your skills and experience that will help you excel
Bachelor's degree in Computer Science, Engineering, or a related field.
Strong understanding of distributed data processing platforms like Databricks and BigQuery.
Proficiency in Python, PySpark, and SQL programming languages.
Experience with performance optimization for large datasets.
Strong debugging and problem-solving skills.
Fundamental knowledge of cloud services, preferably Azure or GCP.
Excellent communication and teamwork skills.
Nice to Have:
Experience in data migration projects.
Understanding of technologies like Delta Lake and data warehouses.
About MSCI
What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
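To illustrate the partitioning and Spark-optimization techniques the posting above names, here is a minimal, hedged PySpark sketch showing partition pruning and a broadcast join; the paths, column names, and table layout are hypothetical:

```python
# Minimal sketch of two optimization techniques named above: partition
# pruning via a date-partitioned layout, and a broadcast join to avoid a
# shuffle for a small dimension table. Paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

# Filtering on the partition column of a partitioned dataset lets Spark
# prune partitions instead of scanning everything.
facts = spark.read.parquet("/data/facts")  # hypothetical, partitioned by ds
recent = facts.filter(F.col("ds") >= "2024-01-01")

# Broadcasting a small lookup table avoids a shuffle-heavy sort-merge join.
dims = spark.read.parquet("/data/dim_products")  # hypothetical small table
joined = recent.join(F.broadcast(dims), on="product_id", how="left")

joined.explain()  # inspect the physical plan to confirm the broadcast
```

The explicit broadcast hint is mainly useful when the dimension table exceeds Spark's automatic broadcast threshold but is still comfortably smaller than the fact table.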
Posted 3 weeks ago
2.0 - 10.0 years
10 - 11 Lacs
Gurugram
Work from Office
About Us
What's in it for YOU
SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
Dynamic, inclusive, and diverse team culture.
Gender-neutral policy.
Inclusive health benefits for all: medical insurance, personal accident cover, group term life insurance, annual health checkup, and dental and OPD benefits.
Commitment to the overall development of employees through a comprehensive learning & development framework.
Role Purpose
The Data Project Manager & Processing role is responsible for analytics-related support for the functional analytics team and for managing data projects, working in close coordination with the Insights & Reporting team.
Role Accountability
Program execution: Responsible for executing all program initiatives undertaken in the data vertical by managing development/SI partners, ensuring projects are delivered on time and with the expected quality. Should be able to create project plans, drive both business and IT teams to work to plan, identify project/program risks, and put mitigations in place or escalate in a timely manner when help is needed.
Assist the Program Delivery Leader in building a team with data processing skills and a good understanding of Python, SAS, SQL, Tableau, or other analytical tools, so that the team can assist the Insights & Reporting team with extraction and processing of data.
Maintain detailed project documentation, including project charters, status reports, FSDs, TSDs, etc., and ensure each project is handed over to the Insights & Reporting team upon successful completion.
Work with business teams and the Data Lake technology team to lead programs and data initiatives arising from new business needs, audits, and regulatory requirements.
Actively participate in new product initiatives (NPIs), provide the data requirements to be implemented for them, and ensure these are implemented for appropriate data insights and analytics.
Interact and collaborate with multiple functions, ensuring that their data requirements are correctly captured, analyzed, and implemented along with new initiatives. Ensure that data requirements are not missed in new initiatives, and that proper thought is applied in drawing up data requirements with audits and regulatory reporting in mind.
Ensure technical support is provided to the Insights & Reporting team wherever required to meet data extraction and analysis requirements.
Build a strong understanding of data processes across the card lifecycle and of how and where data is stored across multiple layers of the data platform.
Collaborate with the senior leadership team, function heads, and the BIU program management team to understand their data needs and deliver on them through the implementation of data initiatives and projects. Drive periodic meetings with business leaders to identify data projects and work closely with IT on implementation.
As a people manager, lead and manage a team of up to 5 direct reports.
Measures of Success
Deliver data projects on time, within approved budget, with no P1 defects in production.
Technical Skills / Experience / Certifications
Good knowledge of SAS, Python, SQL, and Tableau
Good understanding of ETL tools and processes
Good understanding of project management methodology
Competencies critical to the role
Strong experience of delivering multiple programs and leading teams, preferably in the BFSI segment.
Good knowledge of business processes and key business metrics to provide effective solutions.
Good knowledge of preparing high-, mid-, and low-level project plans; proficiency with Microsoft Project.
Ability to lead cross-functional teams to drive data projects and execute data processing tasks.
Strong, inclusive team player who can collaborate with multiple teams and drive them towards a common goal.
Strong analytical skills: strong problem-solving skills, communicates in a clear and succinct manner, effectively evaluates information/data to make decisions, anticipates obstacles, and develops plans to resolve them.
Demonstrated customer focus: evaluates decisions through the eyes of the customer, builds strong relationships, and creates processes that help with timely availability of data to all stakeholders.
Very good written and verbal communication skills.
Qualification
B.E./MCA in Computer Science, or a Graduate/PG degree from a good institute. PMP certification desired.
Preferred Industry
BFSI
Posted 3 weeks ago
0.0 - 3.0 years
9 - 10 Lacs
Noida
Work from Office
Job Description: Data Analyst
Location: Noida, Uttar Pradesh, IND
Our mission is to unlock human potential. We welcome you for who you are and the background you bring, and we embrace individuals who get excited about learning. Bring your experiences, your perspectives, and your passion; it's in our differences that we empower the way the world learns.
About the Role:
The Enterprise Data & Analytics team provides colleagues with actionable analyses, utilizing multiple data sources to provide insight into product performance and business opportunities while helping to inform product and business development. As a Data Analyst in the Enterprise Data & Analytics team, you will be responsible for providing data, reporting tools, analyses, and actionable insights to support the strategic work of the Research & Learning publishing teams. You will ensure that the data Research needs to monitor its business and make strategic decisions is available and accessible. You will produce high-quality reporting products, data visualizations, and analytics that expose publishing and product performance data in the most appropriate format to end users throughout Research Publishing. To achieve this, you will partner with data product owners, provide subject matter expertise, and apply industry best practices to the creation, development, and deployment of reporting and visualization tools. The role will require ongoing collaboration with a wide range of colleagues globally and at various levels of the business. You will need a background in data manipulation, storage, and retrieval, as well as experience in data visualization and effective report compilation.
How will you make an impact:
Build, maintain, and develop high-quality BI reports, primarily using Power BI.
Provide data, insights, and reporting tools to meet the current and future requirements of key stakeholder groups in support of their business objectives.
Collaborate closely with stakeholders to ensure that current reporting needs are met and that future requirements are understood and planned for.
Apply a product management approach to the ideation, development, enhancement, feature release, and consolidation/sunset of reporting and data visualization tools.
Identify sources, gather and prepare data, and extract relevant data sets from a variety of sources.
Slice and rework existing datasets to provide a new lens on the data via simple data processing and visualization (see the sketch after this posting).
Meet the varied needs of your stakeholders, prioritizing user features and applying best practices for deployment and enablement.
Build subject matter expertise on new publishing workflows and accurate data sources.
Ensure all processes, products, data, scripts, and schedules are fully documented in appropriate tools such as Confluence and GitHub.
Provide documentation and training materials to colleagues, giving them the ability to conduct their own analyses and interpret data safely.
Contribute to clarifying and summarizing the data needs of Research Publishing stakeholders and ensure data availability.
Work with data owners to support knowledge discovery on new data and pipeline enhancements, troubleshoot, and provide UAT and QA on data sources.
What we look for:
Strong analytical mindset, data literate, and inquisitive.
Experience in data manipulation, database management, SQL, and data visualization.
Customer-focused approach: an ability to think about reporting from a real-life, author's or editor's point of view, outside the structure of a database or business reporting line, for correct interpretation.
Stakeholder communication and expectation management skills.
Product management experience, particularly in reporting technology or data visualization, is a bonus.
About Wiley:
Wiley is a trusted leader in research and learning; our pioneering solutions and services are paving the way for knowledge seekers as they work to solve the world's most important challenges. We are advocates of advancement, empowering knowledge seekers to transform today's biggest obstacles into tomorrow's brightest opportunities. With over 200 years of experience in publishing, we continue to evolve knowledge seekers' steps into strides, illuminating their path forward to personal, educational, and professional success at every stage. Around the globe, we break down barriers for innovators, empowering them to advance discoveries in their fields, adapt their workforces, and shape minds.
Wiley is an equal opportunity/affirmative action employer. We evaluate all qualified applicants and treat all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, disability, protected veteran status, genetic information, or any individual's status in any group or class protected by applicable federal, state, or local laws. Wiley is also committed to providing reasonable accommodation to applicants and employees with disabilities. Applicants who require accommodation to participate in the job application process may contact tasupport@wiley.com for assistance.
We are proud that our workplace promotes continual learning and internal mobility. Our values support courageous teammates, needle movers, and learning champions, all while striving to support the health and well-being of all employees; for example, we offer meeting-free Friday afternoons, allowing more time for heads-down work and professional development.
We are committed to fair, transparent pay, and we strive to provide competitive compensation in addition to a comprehensive benefits package. The range below represents Wiley's good-faith and reasonable estimate of the base pay for this role at the time of posting for roles in the UK, Canada, or USA. It is anticipated that most qualified candidates will fall within the range; however, the ultimate salary offered for this role may be higher or lower and will be set based on a variety of non-discriminatory factors, including but not limited to geographic location, skills, and competencies. Wiley proactively displays the target base pay range for UK-, Canada-, and USA-based roles.
When applying, please attach your resume/CV to be considered.
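As a hedged illustration of the "slice and rework existing datasets" responsibility above, here is a minimal pandas sketch that re-cuts a flat export into a monthly-by-journal view and plots it; the CSV file and column names are hypothetical:

```python
# Minimal "slice and rework" sketch: recut a flat export into a
# monthly-by-journal view and plot it. The CSV file and column names
# are hypothetical stand-ins for a real publishing dataset.
import pandas as pd
import matplotlib.pyplot as plt

articles = pd.read_csv("published_articles.csv")  # hypothetical export

# New lens on the same data: submissions per journal per month
monthly = (
    articles.assign(month=pd.to_datetime(articles["submitted_at"]).dt.to_period("M"))
            .groupby(["journal", "month"])
            .size()
            .unstack("journal", fill_value=0)
)

monthly.plot(kind="line", title="Submissions per journal by month")
plt.tight_layout()
plt.show()
```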
Posted 3 weeks ago
9.0 - 15.0 years
15 - 19 Lacs
Chennai
Work from Office
Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.
Build ETL pipelines to ingest data from heterogeneous sources into our system (see the sketch after this posting).
Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
Optimize data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.
Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
Troubleshoot and resolve issues related to data processing, storage, and retrieval.
Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle.
Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.
Skills: GCP, SQL, Design & Architecture, ETL, Pub/Sub, BigQuery, Data Engineering, CI/CD
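As a rough sketch of the BigQuery-centric ETL work described above, here is a minimal example using the google-cloud-bigquery Python client to materialize a cleaned table from a raw one; the project, dataset, and table names are hypothetical:

```python
# Minimal BigQuery transform-and-load sketch using the official
# google-cloud-bigquery client. Project, dataset, and table names are
# hypothetical, for illustration only.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Materialize a cleaned table from a raw one in a single SQL job.
sql = """
CREATE OR REPLACE TABLE analytics.orders_clean AS
SELECT order_id, customer_id, DATE(order_ts) AS order_date, amount
FROM raw.orders
WHERE amount IS NOT NULL
"""
job = client.query(sql)  # starts the query job
job.result()             # waits for completion
print(f"Job {job.job_id} finished.")
```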
Posted 3 weeks ago