
3093 Data Processing Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 6.0 years

1 - 6 Lacs

Bengaluru, Karnataka, India

On-site

As a Food & Beverage Sales Executive, your primary responsibility is to handle the business of the in-house Food & Beverage outlets, focusing on maximizing sales opportunities and developing customer relationships. Under the supervision of the Director of Food & Beverage (F&B) or the Food & Beverage Manager, and in coordination with the Director of Operations, you will:

Sales Implementation: Implement all sales action plans related to the respective market and ensure they align with the hotel's business strategy.
Invoicing and Service Correspondence: Ensure that invoicing accurately reflects all services agreed upon and rendered to clients.
Revenue Maximization: Drive Food & Beverage revenue by identifying and capitalizing on upselling opportunities.
Promotion of the Hotel: Actively promote the hotel through hosting, conducting site inspections, and presenting hotel services to potential clients.
Client Relationship Management: Regularly visit existing, former, and potential clients to establish contracts, with a particular focus on commercial accounts.
VIP & Key Account Management: Meet and accompany top key accounts and VIP guests on arrival, ensuring their satisfaction and offering personalized service.
After-Sales Service: Provide excellent after-sales service, ensuring that guest complaints are taken seriously and resolved promptly.
Operations Awareness: Stay well informed about the operations of all Food & Beverage outlets, particularly the departments critical to sales success.
Database Management: Maintain an up-to-date record of current, former, and potential Food & Beverage guests, using the guest database tool to manage and build profiles.
Sales Tracking: Prepare a monthly schedule recording sales activities and outcomes for the preceding month.
Guest Interaction: Regularly engage with guests in Food & Beverage outlets and executive lounges, targeting specific guests for potential sales opportunities.
Feedback Collection: Collect guest feedback, organize meetings with the Food & Beverage team to share insights, and implement improvements as necessary.
Market Awareness: Stay informed on market trends, competitors, and related promotions, reporting key findings to the Director of F&B or F&B Manager in a timely manner.
Exposure and Marketing: Maintain a high level of exposure for the hotel in major market areas through sales calls, written communications, and joint sales initiatives.
Sales Call Recordkeeping: Record all daily sales calls and submit monthly production reports on your list of accounts.
Other Duties: Carry out any other reasonable duties and responsibilities as assigned by management.

What are we looking for? To succeed as a Food & Beverage Sales Executive at Hilton, you should have:

Sales-Oriented Mindset: A strong focus on achieving sales targets and identifying new revenue opportunities.
Excellent Communication Skills: The ability to build and maintain client relationships while providing top-notch customer service.
Problem-Solving Abilities: The capacity to handle guest complaints effectively and resolve issues to the satisfaction of all parties involved.
Organizational Skills: The ability to manage multiple tasks efficiently, including tracking sales activities and maintaining guest records.
Market Awareness: An up-to-date view of market trends and competitor activities to inform sales strategies.
Team Collaboration: The ability to work effectively with other team members and departments to ensure the best possible guest experience.
Guest-Focused Attitude: A strong commitment to guest satisfaction, ensuring every guest's needs are met promptly and professionally.

If you are passionate about food and beverage sales and have a track record of building strong customer relationships while driving revenue, this is the ideal role for you.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Java Developer, you will analyze, design, program, debug, and modify software enhancements and/or new products used in various computer programs. Your expertise in Java, Spring MVC, Spring Boot, database design, and query handling will be used to write code, complete programming, and perform testing and debugging of applications. You will work on local, networked, cloud-based, or Internet-related computer programs, ensuring the code meets the necessary standards for commercial or end-user applications such as materials management, financial management, HRIS, mobile apps, or desktop products.

Your role will involve working with RESTful web services/microservices for JSON creation, data parsing/processing in both batch and stream mode, and messaging platforms such as Kafka, Pub/Sub, and ActiveMQ. Proficiency with operating systems, Linux, virtual machines, and open-source tools/platforms is crucial for successful implementation. You are also expected to understand data modeling and storage with NoSQL or relational databases, and to have experience with Jenkins, containerized microservices deployment in cloud environments, and Big Data development (Spark, Hive, Impala, time-series databases).

To excel in this role, you should have a solid understanding of building microservices/web services using Java frameworks, REST API standards and practices, and object-oriented analysis and design patterns. Experience with cloud technologies such as Azure, AWS, and GCP is advantageous. Telecom domain experience and familiarity with protocols such as TCP, UDP, SNMP, SSH, FTP, SFTP, CORBA, and SOAP are preferred. Enthusiasm for the work, a passion for coding, and a proactive, self-starting attitude are key qualities for success, along with strong communication, analytical, and problem-solving skills and the ability to write quality, testable, modular code. Experience with Big Data platforms, participation in Agile development methodologies, and exposure to a start-up environment are beneficial. Team-leading experience is an added advantage, and immediate joiners will be given priority.

If you possess the necessary skills and experience, have a keen interest in software development, and are ready to contribute to a dynamic team environment, we encourage you to apply for this role.

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Delhi

On-site

As a Snowflake Solution Architect, you will own and drive the development of Snowflake solutions and products as part of the COE. You will work with and guide the team to build solutions using the latest innovations and features launched by Snowflake, conduct sessions on the latest and upcoming launches of the Snowflake ecosystem, and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will publish articles and architectures that solve business problems, and work on accelerators that demonstrate how Snowflake solutions and tools integrate with, and compare to, other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will triage and resolve advanced, long-running customer issues while ensuring timely and clear communication, and develop and maintain robust internal documentation, knowledge bases, and training materials to scale support efficiency. You will also support enterprise-scale RFPs focused on Snowflake.

To be successful in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should have experience implementing and operating Snowflake-centric solutions, along with proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform. An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential, as are strong skills in databases, data warehouses, and data processing, plus extensive hands-on expertise with SQL and SQL analytics. Familiarity with data science concepts and Python is a strong advantage. You will need knowledge of Snowflake components such as Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access-control management, metadata management, infrastructure management and auto-scaling, the Snowflake Marketplace for datasets and applications, and DevOps and orchestration tools such as Airflow, dbt, and Jenkins. Snowflake certifications are good to have.

Strong communication and presentation skills are essential, as you will engage with both technical and executive audiences, and you should be skilled at working collaboratively across engineering, product, and customer success teams. This position is open in all Xebia office locations: Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur. If you meet the above requirements and are excited about this opportunity, please share your details here: [Apply Now](https://forms.office.com/e/LNuc2P3RAf)
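To make the access-control work described above concrete, here is a minimal, hypothetical sketch using the snowflake-connector-python library; the account, role, database, and schema names are invented for illustration and are not from the posting.

```python
# Hedged sketch: provisioning a read-only role in Snowflake from Python.
# Account, user, database, and role names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="COE_ADMIN",
    password="***",
    role="SECURITYADMIN",
)
cur = conn.cursor()
try:
    # Create a read-only role scoped to a single schema.
    cur.execute("CREATE ROLE IF NOT EXISTS ANALYTICS_READER")
    cur.execute("GRANT USAGE ON DATABASE SALES_DB TO ROLE ANALYTICS_READER")
    cur.execute("GRANT USAGE ON SCHEMA SALES_DB.PUBLIC TO ROLE ANALYTICS_READER")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA SALES_DB.PUBLIC TO ROLE ANALYTICS_READER")
finally:
    cur.close()
    conn.close()
```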

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer with 12-16 years of experience in supply chain retail technology, you will join the Fulfilment Optimization team at Wayfair. The team builds the platforms that determine how customer orders are fulfilled, optimizing for Wayfair profitability and customer satisfaction. Your role will involve enhancing and scaling customer-facing platforms that provide fulfillment information on the website, accurately representing a dynamic supply chain and surfacing the necessary information to customers, suppliers, and carriers in real time.

In this position, you will partner with business stakeholders to provide transparency, data, and resources for informed decision-making. You will be a technical leader within and across teams, driving high-impact architectural decisions and hands-on development that follows best design and coding practices. Your responsibilities will include ensuring production readiness, identifying risks and proposing solutions, contributing to the team's strategy and roadmap, and fostering a culture of continuous learning and innovation.

To be successful in this role, you should have a Bachelor's degree in Computer Science or a related field, along with at least 12 years of experience in a senior engineer or technical lead role. You should have mentored a team of 10-12 people and possess expertise in developing and designing scalable distributed systems. Strong communication skills, the ability to collaborate effectively with cross-functional teams, and a passion for mentoring and leading engineers are essential. Experience designing APIs and microservices, working with cloud technologies (specifically GCP), data processing, and data pipelines, plus familiarity with common open-source platforms and tools such as Kafka, Kubernetes, Java microservices, and GraphQL APIs, are highly desirable. Experience designing and developing recommendation systems, productionizing ML models, and working with event-driven systems and technologies would be advantageous.

Joining Wayfair means being part of one of the world's largest online destinations for home goods. With a commitment to industry-leading technology and creative problem-solving, Wayfair offers rewarding career opportunities for individuals seeking rapid growth, continuous learning, and dynamic challenges. If you want to be part of a team that is reinventing the way people shop for their homes, Wayfair is the place for you.

Please review our Candidate Privacy Notice for information on how your personal data is processed. If you have any questions or wish to exercise your privacy rights, please contact us at dataprotectionofficer@wayfair.com.

Posted 1 week ago

Apply

0.0 - 5.0 years

0 - 1 Lacs

Asansol

Work from Office

Swasthya Sathi data entry, uploading, and query resolution. MIS report preparation. Billing. Advanced Excel. Office coordination.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

Your role at Prudential is to design, build, and maintain data pipelines that ingest data from multiple sources into the cloud data platform. These pipelines must be built to defined standards and documented comprehensively, and data governance standards must be adhered to and enforced to maintain data integrity and compliance. You will also implement data quality rules to ensure the accuracy and reliability of the data, and implement data security and protection controls around Databricks Unity Catalog.

You will use Azure Data Factory, Azure Databricks, and other Azure services to build and optimize data pipelines. Proficiency in SQL, Python/PySpark, and other programming languages for data processing and transformation is crucial, as is staying current with the latest Azure technologies and best practices. You will provide technical guidance and support to team members and stakeholders, maintain detailed documentation of data pipelines, processes, and data quality rules, and routinely debug, fine-tune, and optimize large-scale data processing jobs. Generating reports and dashboards to monitor data pipeline performance and data quality metrics is also important, as is collaborating with data teams across Asia and Africa to understand data requirements and deliver solutions.

Overall, your role at Prudential will involve designing, building, and maintaining data pipelines, ensuring data integrity, implementing data quality rules, and collaborating with various teams to deliver effective data solutions.
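As a rough illustration of the data quality rules mentioned above, here is a minimal PySpark sketch of the kind that might run on Azure Databricks; the table path, column names, and quarantine location are assumptions, not details from the posting.

```python
# Hedged sketch: a simple data-quality rule in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical Delta table landed by an ingestion pipeline
# (Delta support is assumed, as on Databricks).
df = spark.read.format("delta").load("/mnt/raw/policies")

# Rule: policy_id must be non-null and premium must be positive.
failed = df.filter(F.col("policy_id").isNull() | (F.col("premium") <= 0))

fail_count = failed.count()
if fail_count > 0:
    # Quarantine bad rows for investigation rather than silently dropping them.
    failed.write.format("delta").mode("append").save("/mnt/quarantine/policies")
    raise ValueError(f"{fail_count} rows failed data-quality checks")
```

In practice such rules would live alongside the pipeline definitions and feed the monitoring dashboards the role describes.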

Posted 1 week ago

Apply

2.0 - 6.0 years

0 - 0 Lacs

Kochi, Kerala

On-site

We have a few open positions for Service Executive at Cochin. To be eligible, you should have a minimum of 2 years of experience in Customer Service, Data Processing, or Data Entry. The required qualifications are BA, B.Com, B.Sc., BBA, BBM, or BCA. You must have good communication and interpersonal skills, proficiency in English and a regional language, the ability to handle customers, and good keyboard skills with accuracy. Only full-time courses will be considered, and there should not be more than a 2-year gap during education or employment. This is a non-technical requirement, so candidates from technical backgrounds such as BE, B.Tech, M.Tech, and MCA need not apply.

The salary offered for this position ranges from 2.5 to 2.8 Lacs per annum. The job type is full-time. Benefits include health insurance, leave encashment, paid sick time, and Provident Fund. The working schedule is the day shift. The work location is in person.

Please answer the following questions in your application:
1. Do you have both PAN and Aadhaar? (Yes/No/PAN only/Aadhaar only)
2. Do you have all your educational certificates ready (SSC, HSC, graduation - provisional, original)? (Yes/No/Only provisional)
3. Do you have a minimum of 2 years of experience in Customer Service, Data Entry, or Data Processing? (Yes/No)
4. Do you have a gap in education or employment? If yes, how many years? If no, please state so.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Ghaziabad, Uttar Pradesh

On-site

As a skilled Python Backend Engineer at Cognio Labs, you will leverage your expertise in FastAPI and your strong foundation in Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) technologies. The role blends backend development and data science to support data processing for model fine-tuning and training.

You should have a minimum of 2 years of experience in Python backend development and be able to develop and maintain APIs using the FastAPI framework. Proficiency in asynchronous programming, background task implementation, and database management with both SQL and NoSQL databases (especially MongoDB) is essential, as is familiarity with Git version control and RESTful API design and implementation. Experience with containerization technologies like Docker, an understanding of component-based architecture principles, and the ability to write clean, maintainable, testable code are valuable additional technical skills, along with knowledge of testing frameworks, quality-assurance practices, and AI technologies such as LangChain, ChatGPT endpoints, and other LLM frameworks.

On the AI and data science side, experience with LLMs and RAG implementation is highly valued. You should be adept at preparing data for fine-tuning language models, manipulating and analyzing data with Python libraries such as Pandas and NumPy, and implementing machine learning workflows efficiently.

Your key responsibilities will include designing, developing, and maintaining robust, scalable APIs with FastAPI; preparing data for model fine-tuning and training; implementing background tasks and asynchronous processing for system optimization; integrating LLM- and RAG-based solutions into the product ecosystem; and writing efficient, maintainable code that follows industry best practices. Collaborating with team members, designing and implementing databases, troubleshooting and debugging codebase issues, and staying current with emerging technologies in Python development, LLMs, and data science are all integral parts of the role.
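For flavor, here is a minimal FastAPI sketch of the asynchronous endpoint plus background-task pattern the posting emphasizes; the endpoint path, request model, and helper function are illustrative, not part of Cognio Labs' actual codebase.

```python
# Hedged sketch: async FastAPI endpoint that offloads dataset preparation
# to a background task so the request returns immediately.
from fastapi import BackgroundTasks, FastAPI
from pydantic import BaseModel

app = FastAPI()

class IngestRequest(BaseModel):
    dataset_uri: str

def prepare_for_finetuning(dataset_uri: str) -> None:
    # Placeholder for cleaning/formatting records ahead of model fine-tuning.
    print(f"processing {dataset_uri}")

@app.post("/datasets/ingest")
async def ingest(req: IngestRequest, background_tasks: BackgroundTasks):
    # Respond immediately; the heavy processing continues after the response.
    background_tasks.add_task(prepare_for_finetuning, req.dataset_uri)
    return {"status": "accepted", "dataset": req.dataset_uri}
```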

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

The role aims to provide analysis aligned with client business objectives and goals while exceeding client performance targets. As a Data Processing professional, you will need at least 2 years of experience working with Q and SPSS (preferred), Dimensions, SPSS syntax, or similar data processing software. This involves handling data for large, multi-market, complex tracker projects and executing tasks such as weighting, stacking, and sig testing while adhering to industry best practices.

Your responsibilities will include conducting thorough quality checks and data validation to ensure accuracy. You will collaborate with internal project managers and client services team members to finalize materials like data processing spec forms, offering guidance on tool functionality and solutions, and you will develop and maintain data processing workflows and documentation for efficient operations. Proficiency in Microsoft Excel is required, with experience or knowledge of VBA macro writing considered a plus. The role may involve working night shifts on a rotational basis, providing 24/7 operational support, and being available on weekends as per the roster.

As a client-focused professional, you should possess strong consulting, communication, and collaboration skills. Being emotionally intelligent, adept at conflict resolution, and able to thrive in high-pressure, fast-paced environments is crucial, as are ownership, problem-solving abilities, and effective multitasking and prioritization.

Location: DGS India - Coimbatore - KGISL Tech Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
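Purely to illustrate the weighting and significance-testing tasks named above (which this role would perform in Q, SPSS, or Dimensions rather than Python), here is a minimal pandas/SciPy sketch with invented data:

```python
# Hedged sketch: weighted percentages plus a simple significance test.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "market": ["IN", "IN", "UK", "UK", "UK"],
    "aware":  [1, 0, 1, 1, 0],
    "weight": [1.2, 0.8, 1.0, 0.9, 1.1],   # post-stratification weights
})

# Weighted % aware per market: weighted "aware" total over total weight.
g = (df.assign(w_aware=df["aware"] * df["weight"])
       .groupby("market")[["w_aware", "weight"]].sum())
g["pct_aware"] = g["w_aware"] / g["weight"]
print(g["pct_aware"].round(3))

# Unweighted two-sample t-test between markets -- a simplified stand-in
# for the column-proportion sig tests a DP platform would run.
t, p = stats.ttest_ind(
    df.loc[df["market"] == "IN", "aware"],
    df.loc[df["market"] == "UK", "aware"],
    equal_var=False,
)
print(f"t={t:.2f}, p={p:.3f}")
```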

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

The Data Processing Manager will be a key member of the newly formed India-based DMD Hub, supporting quantitative healthcare market research projects across global markets. The role is responsible for the accurate and timely production of interim and final tabulations and requires close collaboration with the research, project management, and scripting teams.

You will run and validate interim and final tabulations for quantitative research studies using relevant software such as Q, Quantum, SPSS, Excel-based platforms, or proprietary tools, ensuring all tabulations meet internal and client-specific formatting, logic, and output requirements. You will check tabulated data for accuracy, completeness, and consistency before delivery, and support QA processes and documentation to maintain high data standards across projects.

Your role will involve preparing raw datasets for processing, including basic cleaning, formatting, and consistency checks. You will work with coded and uncoded datasets, ensuring proper integration into final tabulation outputs, and collaborate with the scripting/programming team to ensure accurate data structure for tabulation. You will work closely with the research and project management teams to understand project requirements, timelines, and deliverables; your input and feedback to internal stakeholders will help optimize processes and outputs. You will manage assigned tasks within deadlines, proactively flag any delays or data issues to the manager and wider team, maintain clear documentation of tabulation processes and version control, and support updates to project logs and workflow-tracking systems. Continuous improvement of your knowledge of data processing tools, tabulation best practices, and healthcare market research processes is expected, and you will provide training, mentorship, and professional development to the DP team.

Skills and Experience:
- More than 8 years of experience in data processing within a market research agency, preferably in healthcare (other market research sectors will also be considered).
- Strong proficiency in data tabulation software, preferably Q; experience with Quantum or similar tools will be recognized.
- Exposure to or understanding of survey scripting tools and survey data structures is advantageous.
- Knowledge of data cleaning, validation, coding (open ends), and cross-tabulation methodologies.
- Experience working with survey data formats (e.g., XML, CSV), plus knowledge of relational databases and data structure management.
- Understanding of quantitative research methodologies, questionnaire structures, and healthcare market research best practices.
- Experience with data mapping, data transformations, and structuring for reporting purposes.
- Problem-solving skills, particularly in situations requiring analytical judgment and establishing best-practice solutions.
- Skill in handling complex datasets, multiple data sources, and large volumes of data efficiently.
- Strong communication skills, particularly in explaining technical points to non-technical colleagues in an international environment.
- Ability to identify process-improvement opportunities and implement SOPs for efficiency.
- High attention to detail, with a strong focus on data accuracy and consistency.
- Logical thinking and the ability to troubleshoot errors or discrepancies in tabulated data.
- Ability to interpret tabulation specifications and apply them accurately.

Our company values diversity and inclusivity. If you are excited about this role but do not meet every requirement, we encourage you to apply; your unique experience and perspective may make you the perfect fit for this position or other opportunities within our organization.
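To show what a tabulation deliverable boils down to, here is a hedged pandas sketch of a weighted cross-tab with column percentages; in this role the equivalent output would come from Q or Quantum, and the data below is invented.

```python
# Hedged sketch: weighted cross-tabulation with column percentages.
import pandas as pd

df = pd.DataFrame({
    "age_band": ["18-34", "18-34", "35-54", "35-54", "55+"],
    "brand":    ["A", "B", "A", "A", "B"],
    "weight":   [1.1, 0.9, 1.0, 1.2, 0.8],
})

# Weighted counts per cell (sum of respondent weights), normalized by column
# to give the percentages a typical banner table reports.
tab = pd.crosstab(
    df["age_band"], df["brand"],
    values=df["weight"], aggfunc="sum",
    normalize="columns",
).fillna(0)
print(tab.round(3))
```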

Posted 1 week ago

Apply

2.0 - 11.0 years

25 - 30 Lacs

Pune

Work from Office

HSBC Electronic Data Processing India Pvt Ltd is looking for a Senior Software Engineer to join our dynamic team and embark on a rewarding career journey:

Developing and directing software system validation and testing methods.
Directing our software programming initiatives.
Overseeing the development of documentation.
Working closely with clients and cross-functional departments to communicate project statuses and proposals.
Analyzing data to effectively coordinate the installation of new systems or the modification of existing systems.
Managing the software development lifecycle.
Monitoring system performance.
Communicating key project data to team members and building cohesion among teams.
Developing and executing project plans.
Applying mathematics and statistics to problem-solving initiatives.
Applying best practices and standard operating procedures.
Creating innovative solutions to meet our company's technical needs.
Testing new software and fixing bugs.
Shaping the future of our systems.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Manager, Data Visualization

The Opportunity: Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence as part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to our other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview: A unique opportunity to be part of an Insight & Analytics data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership. As a Manager in Data Visualization, you will design and develop compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs, reviewing the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What will you do in this role:
Design and develop user-centric data visualization solutions utilizing complex data sources.
Identify and define key business metrics and KPIs in partnership with business stakeholders.
Define and develop scalable data models in alignment with, and with support from, the data engineering and IT teams.
Lead UI/UX workshops to develop user stories and wireframes and to design intuitive visualizations.
Collaborate with data engineering, data science, and IT teams to deliver business-friendly dashboard and reporting solutions.
Apply best practices in data visualization design and continuously improve the user experience for business stakeholders.
Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
Provide training and coaching to internal stakeholders to enable a self-service operating model.
Co-create information governance and apply data privacy best practices to solutions.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

What should you have:
5+ years of relevant experience in data visualization, infographics, and interactive visual storytelling.
Working experience and knowledge of Power BI, QLIK, Spotfire, Tableau, and other data visualization technologies.
Working experience and knowledge of ETL processes and data modeling techniques and platforms (Alteryx, Informatica, Dataiku, etc.).
Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.).
Experience leveraging and managing third-party vendors and contractors.
Self-motivation, proactivity, and the ability to work independently with minimum direction.
Excellent interpersonal and communication skills.
Excellent organizational skills, with the ability to navigate a complex matrix environment and organize and prioritize work efficiently and effectively.
Demonstrated ability to collaborate and lead with diverse groups of colleagues and positively manage ambiguity.
Experience in the pharma and/or biotech industry is a plus.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Intelligence (BI), Clinical Decision Support (CDS), Clinical Testing, Communication, Create User Stories, Data Visualization, Digital Transformation, Healthcare Innovation, Information Technology Operations, IT Operation, Management Process, Marketing, Motivation Management, Requirements Management, Self Motivation, Statistical Analysis, Statistics, Thought Leadership, User Experience (UX) Design
Job Posting End Date: 07/31/2025 (a posting is effective until 11:59:59 PM on the day before the listed end date; please apply no later than the day before the end date).

Posted 1 week ago

Apply

10.0 - 18.0 years

50 - 55 Lacs

Pune

Work from Office

As a Manager, Software Engineering with a focus on Data Engineering and Data Science, you will play a pivotal role in shaping and growing our Authentication Program. Your primary responsibility will be to ensure that our data is of the highest quality and that our teams can work effectively. You will lead a team of talented engineers, drive innovation, and ensure the successful delivery of high-impact projects.

Responsibilities:
Lead the development and implementation of scalable data engineering and data science solutions to support the Authentication Program.
Ensure the quality, accuracy, and reliability of data across all systems and processes.
Mentor and provide technical guidance to a team of engineers and data scientists.
Drive strategic decisions and deliver innovative solutions in collaboration with cross-functional teams.
Collaborate with product stakeholders to prioritize initiatives and align with business goals.
Develop and maintain data pipelines, ensuring data is processed efficiently and accurately.
Implement and promote best practices in data engineering, data science, and software development.
Automate and streamline data processing workflows and development processes.
Conduct proofs of concept (POCs) to evaluate and introduce new technologies.
Participate in Agile ceremonies and contribute to team prioritization and planning.
Develop and present roadmaps and proposals to senior management and stakeholders.
Foster a culture of continuous improvement and excellence within the team.

Qualifications:

Technical Expertise:
Experience in Data Engineering, Data Science, or related fields.
Proficiency in programming languages such as Python, Java, or Scala.
Hands-on experience with data processing frameworks.
Experience with data warehousing solutions.
Solid understanding of data modeling, ETL processes, and data pipeline orchestration.
Familiarity with machine learning frameworks and libraries.
Knowledge of secure coding practices and data privacy regulations.

Leadership and Communication:
Proven experience in leading and managing technical teams.
Strong problem-solving and decision-making skills.
Excellent written and verbal communication skills.
Ability to work effectively with cross-functional teams and stakeholders.
Experience in Agile methodologies and project management.

Preferred Qualifications:
Degree in Computer Science, Data Science, Engineering, or a related field.
Experience with streaming data platforms.
Familiarity with data visualization tools.
Experience with CI/CD pipelines and DevOps practices.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a highly experienced and innovative AI/ML Engineer to lead the design, development, and deployment of scalable machine learning solutions. The ideal candidate will have deep expertise in Python, Apache Spark, MLOps, and cloud platforms (GCP and AWS), along with experience in distributed query engines like Trino. You will play a key role in building intelligent systems that drive business value through data science and machine learning.

Key Responsibilities:
Design and implement scalable ML pipelines using Spark, Python, and MLOps best practices.
Develop, train, and deploy machine learning models in production environments.
Collaborate with data scientists, data engineers, and product teams to translate business problems into ML solutions.
Optimize model performance and ensure reproducibility, versioning, and monitoring using MLOps tools and frameworks.
Work with Trino and other distributed query engines for efficient data access and feature engineering.
Deploy and manage ML workloads on GCP and AWS, leveraging services such as SageMaker, Vertex AI, BigQuery, and EMR.
Implement CI/CD pipelines for ML workflows and ensure compliance with data governance and security standards.
Mentor junior engineers and contribute to the development of best practices and technical standards.

Required Skills:
Strong programming skills in Python, with experience in ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
Expertise in Apache Spark for large-scale data processing.
Solid understanding of MLOps practices, including model versioning, monitoring, and deployment.
Experience with Trino or similar distributed SQL engines (e.g., Presto, Hive).
Hands-on experience with GCP (e.g., Vertex AI, BigQuery) and AWS (e.g., SageMaker, S3, Lambda).
Familiarity with containerization (Docker) and orchestration (Kubernetes).
Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.

Preferred Qualifications:
Master's or PhD in Computer Science, Data Science, or a related field.
Experience with feature stores, model registries, and ML observability tools.
Knowledge of data privacy, security, and compliance in ML systems.
Contributions to open-source ML or data engineering projects.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TransUnion office location for a minimum of two days a week.

TransUnion Job Title: Sr Developer, Applications Development
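As a sketch of the "scalable ML pipelines using Spark" responsibility, here is a minimal Spark ML pipeline in Python; the feature names, label column, and storage paths are assumptions for illustration only.

```python
# Hedged sketch: an end-to-end Spark ML pipeline that is fit, then persisted
# so it can be versioned and re-loaded for serving.
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import StandardScaler, VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("churn-pipeline").getOrCreate()
df = spark.read.parquet("s3://bucket/features/")   # hypothetical feature table

assembler = VectorAssembler(inputCols=["tenure", "monthly_spend"],
                            outputCol="raw_features")
scaler = StandardScaler(inputCol="raw_features", outputCol="features")
lr = LogisticRegression(labelCol="churned", featuresCol="features")

pipeline = Pipeline(stages=[assembler, scaler, lr])
train, test = df.randomSplit([0.8, 0.2], seed=42)
model = pipeline.fit(train)

# Persisting the fitted pipeline is the first step toward the versioning and
# reproducibility the MLOps bullet points call for.
model.write().overwrite().save("s3://bucket/models/churn/v1")
```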

Posted 1 week ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

;:" Your responsibilities The Senior Data Scientist will lead the development of data-driven solutions by leveraging traditional data science techniques and recent advancements in Generative AI to bring value to ADM. The role is integral to the Digital & Innovation team, driving rapid prototyping efforts, collaborating with cross-functional teams, and developing innovative approaches to solve business problems. This position requires a blend of expertise in traditional machine learning models, data science practices, and emerging AI technologies to create value and improve business outcomes. Key Responsibilities: Lead end-to-end machine learning projects, from data exploration, modeling, and deployment, ensuring alignment with business objectives. Utilize traditional AI/data science methods (e.g., regression, classification, clustering) and advanced AI methods (e.g., neural networks, NLP) to address business problems and optimize processes. Implement and experiment with Generative AI models based on business needs using Prompt Engineering, Retrieval Augmented Generation (RAG) or Finetuning, using LLM\u0027s, LVM\u0027s, TTS etc. Collaborate with teams across Digital & Innovation, business stakeholders, software engineers, and product teams, to rapidly prototype and iterate on new models and solutions. Mentor and coach junior data scientists and analysts, fostering an environment of continuous learning and collaboration. Adapt quickly to new AI advancements and technologies, continuously learning and applying emerging methodologies to solve complex problems. Work closely with other teams (e.g., Cybersecurity, Cloud Engineering) to ensure the successful integration of models into production systems. Ensure models meet rigorous performance, accuracy, and efficiency standards, performing cross-validation, tuning, and statistical checks. Communicate results and insights effectively to both technical and non-technical stakeholders, delivering clear recommendations for business impact. Ensure adherence to data privacy, security policies, and governance standards across all data science initiatives.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Chennai

Work from Office

Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role? This role is part of the Treasury Applications Platform team. We are currently modernizing our platform and migrating it to GCP. You will contribute towards making the platform more resilient and secure for future regulatory requirements, ensuring compliance and adherence to Federal Regulations.

What you need:
Preferably a BS or MS degree in computer science, computer engineering, or another technical discipline.
10+ years of software development experience.
Ability to effectively interpret technical and business objectives and challenges, and articulate solutions.
Willingness to learn new technologies and exploit them to their optimal potential.
Strong experience in Finance, Controllership, and Treasury applications.
Strong background in Java, Python, PySpark, SQL, concurrency/parallelism, Oracle, big data, and in-memory computing platforms.
Cloud experience, with GCP preferred.

What you will do:
Conduct IT requirements gathering; define problems and provide solution alternatives.
Own solution architecture and system design; create detailed system design documentation.
Implement deployment plans.
Understand business requirements with the objective of providing high-quality IT solutions.
Support the team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment.
Under supervision, participate in unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions.
Troubleshoot issues, diagnose problems, and conduct root-cause analysis.
Perform secondary research as instructed by your supervisor to assist in strategy and business planning.

Minimum Qualifications: Strong experience with cloud architecture; deep understanding of SDLC, OOAD, CI/CD, containerization, Agile, Java, and PL/SQL.

Preferred Qualifications: GCP; big data processing systems; Finance/Treasury cash management; Kotlin; Kafka; OpenTelemetry; networking.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally.

Posted 1 week ago

Apply

7.0 - 12.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Building on our Cloud momentum, Oracle has formed a new organization: Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net-new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment, and we need your contribution to make it a world-class engineering center focused on excellence.

Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will define specifications for significant new projects and specify, design, and develop software according to those specifications, performing professional software development tasks associated with developing, designing, and debugging software applications or operating systems. You will design and build distributed, scalable, and fault-tolerant software systems; build cloud services on top of the modern OCI infrastructure; and participate in the entire software lifecycle, from design through development, quality assurance, and production. You will invest in the best engineering and operational practices upfront to keep our software quality bar high, optimize data processing pipelines for orders-of-magnitude higher throughput and faster latencies, and leverage a wealth of internal OCI tooling to develop, build, deploy, and troubleshoot software.

Qualifications:
7+ years of experience in the software industry working on the design, development, and delivery of highly scalable products and services.
Understanding of the entire product development lifecycle, including understanding and refining technical specifications, HLD and LLD of world-class products and services, refining architecture by providing feedback and suggestions, developing and reviewing code, driving DevOps, and managing releases and operations.
Strong knowledge of Java or JVM-based languages.
Experience with multi-threading and parallel processing.
Strong knowledge of big data technologies such as Spark, Hadoop MapReduce, Crunch, etc.
Experience building scalable, performant, and secure services/modules.
Understanding of microservices architecture and API design.
Experience with container platforms.
Good understanding of testing methodologies.
Experience with CI/CD technologies.
Experience with observability tools such as Splunk, New Relic, etc.
Good understanding of versioning tools such as Git/SVN.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Title: Data/AI Engineer - GenAI & Agentic AI Integration (Azure)
Location: Bangalore, India
Job Type: Full-Time

About the Role: We are seeking a highly skilled Data/AI Engineer to join our dynamic team, specializing in integrating cutting-edge Generative AI (GenAI) and Agentic AI solutions within the Azure cloud environment. The ideal candidate will have a strong background in Python, data engineering, and AI model integration, with hands-on experience with Databricks, Snowflake, Azure Storage, and Palantir platforms. You will play a crucial role in designing, developing, and deploying scalable data and AI pipelines that power next-generation intelligent applications.

Key Responsibilities:
Design, develop, and maintain robust data pipelines and AI integration solutions using Python on Azure Databricks.
Integrate Generative AI and Agentic AI models into existing and new workflows to drive business innovation and automation.
Collaborate with data scientists, AI researchers, software engineers, and product teams to deliver scalable and efficient AI-powered solutions.
Orchestrate data movement and transformation across Azure-native services, including Azure Databricks, Azure Storage (Blob, Data Lake), and Snowflake, ensuring data quality, security, and compliance.
Integrate enterprise data using Palantir Foundry and leverage Azure services for end-to-end solutions.
Develop and implement APIs and services to facilitate seamless AI model deployment and integration.
Optimize data workflows for performance and scalability within Azure.
Monitor, troubleshoot, and resolve issues related to data and AI pipeline performance.
Document architecture, designs, and processes for knowledge sharing and operational excellence.
Stay current with advances in GenAI, Agentic AI, Azure data engineering best practices, and cloud technologies.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent practical experience).
5+ years of professional experience in data engineering or AI engineering roles.
Strong proficiency in Python for data processing, automation, and AI model integration.
Hands-on experience with Azure Databricks and Spark for large-scale data engineering.
Proficiency in working with Snowflake for cloud data warehousing.
In-depth experience with Azure Storage solutions (Blob, Data Lake) for data ingestion and management.
Familiarity with Palantir Foundry or similar enterprise data integration platforms.
Demonstrated experience integrating and deploying GenAI or Agentic AI models in production environments.
Knowledge of API development and integration for AI and data services.
Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
Excellent communication and documentation skills.

Preferred Qualifications:
Experience with Azure Machine Learning, Azure Synapse Analytics, and other Azure AI/data services.
Experience with MLOps, model monitoring, and automated deployment pipelines in Azure.
Exposure to data governance, privacy, and security best practices.
Experience with visualization tools and dashboard development.
Knowledge of advanced AI model architectures, including LLMs and agent-based systems.

#DataEngineer | Job ID R-75732 | Date posted 07/24/2025
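To give a feel for the data movement described under Key Responsibilities, here is a hedged PySpark sketch of reading from Azure Data Lake and landing in Snowflake from Databricks; every path, option value, and credential is a placeholder, and the Snowflake Spark connector is assumed to be installed on the cluster.

```python
# Hedged sketch: ADLS -> Databricks transform -> Snowflake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical raw events landed in Azure Data Lake Storage Gen2.
raw = spark.read.json("abfss://raw@mystorageacct.dfs.core.windows.net/events/")

clean = (raw.filter(F.col("event_id").isNotNull())
            .withColumn("ingested_at", F.current_timestamp()))

sf_options = {  # placeholder connection details
    "sfURL": "myorg-myaccount.snowflakecomputing.com",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "EVENTS",
    "sfWarehouse": "LOAD_WH",
    "sfUser": "PIPELINE_USER",
    "sfPassword": "***",
}

(clean.write.format("snowflake")   # short name available on Databricks
      .options(**sf_options)
      .option("dbtable", "CLEAN_EVENTS")
      .mode("append")
      .save())
```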

Posted 1 week ago

Apply

10.0 - 15.0 years

14 - 19 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Explain complex technologies in simple terms to clients, peers, and management. Work in tandem with our engineering team to identify and implement the most optimal cloud-based solutions for the company, identifying appropriate cloud services to support applications in the cloud. Define and document best practices and strategies for application deployment in the cloud and its maintenance. Provide guidance, thought leadership, and mentorship to developer teams to build their cloud competencies. Ensure cloud environments are in accordance with company security guidelines. Orchestrate and automate cloud-based platforms throughout the company. Analyze the usage of cloud services and implement cost-saving strategies. Analyze cloud-hosted application performance, uptime, and scalability, and maintain high standards for code quality and thoughtful design. Stay current with industry trends, making recommendations as needed to help the organization innovate and excel. Work with the Cloud Transformation team to build migration strategies for various migration use cases.

Qualification & Experience:
Having a basic understanding of or exposure to AI tools would be a plus.
Expert in cloud networking, including connectivity between cloud and on-prem data centres.
Expert in global routing, DNS, and network segregation capabilities.
At least 10+ years' experience in IT/cloud infrastructure (architect/SME/lead), including 5+ years' experience in cloud technologies.
AWS Certified Solutions Architect - Associate; Azure certified; CKA is good to have.
Strong experience working with Azure and AWS cloud services, including EKS and AKS.
Strong experience with cloud monitoring, logging, resource management, and cost controls.
Cloud database experience, including knowledge of SQL and NoSQL and related data stores such as Postgres.
Strong awareness of networking concepts, including best practices for cloud connectivity, TCP/IP, DNS, SMTP, HTTP, and distributed networks.
Strong understanding of and experience in setting up highly resilient applications, including DR design and setup.
Strong awareness of cloud security concepts and services.
Strong awareness of the cloud migration phases and the 6R migration approaches.
Good analytical skills for identifying potential bottlenecks in application performance.
Maintaining data integrity by implementing proper access control for cloud services.

Functional/Domain (e.g., Underwriting, Claims Management):
Understanding of and experience with all the pillars of a well-architected framework.
Experience in IT infrastructure automation and Infrastructure as Code, e.g., Ansible, Terraform.
Knowledge of technology trends and best practices.
Experience using architectural design software such as MS Visio and ArchiMate.
Experience working in a highly diverse, multi-national, multi-cultural environment.

Main tasks (including occasional special or short-term tasks):
At the direction of lead architects, develop and implement technical efforts to design, build, and deploy cloud applications, including large-scale data processing, computationally intensive statistical modeling, and advanced analytics.
Participate in all aspects of the software development lifecycle for cloud solutions, including planning, requirements, development, testing, and quality assurance.
Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures.
Educate teams on the implementation of new cloud-based initiatives, providing associated training when necessary.
Demonstrate exceptional problem-solving skills, with an ability to see and solve issues before they affect business productivity.
Serve as a technology expert in delivering technical support services.
Work at all stages of the product lifecycle, from requirements through design, implementation, and support.
Help clients set the company's strategic technology direction based on experience and real-time input from users.
Evaluate technologies to determine strengths and weaknesses in architecture, implementation, and suitability.
Make recommendations consistent with the vision of the business area/enterprise.

Posted 1 week ago

Apply

7.0 - 12.0 years

35 - 40 Lacs

Chennai

Work from Office

Your work days are brighter here.

About the Team: Workday Prism Analytics is a self-service analytics solution for Finance and Human Resources teams that allows companies to bring external data into Workday, combine it with existing people or financial data, and present it via Workday's reporting framework. This gives the end user a comprehensive collection of insights that can be acted on in a flash. We design, build, and maintain the data warehousing systems that underpin our Analytics products. We straddle both applications and systems; the ideal candidate for this role has a passion for solving hyper-scale engineering challenges to serve the largest companies on the planet.

About the Role: As part of Workday's Prism Analytics team, you will be responsible for the integration of our Big Data Analytics stack with Workday's cloud infrastructure. You will work on building, improving, and extending large-scale distributed data processing frameworks like Spark, Hadoop, and YARN in a multi-tenanted cloud environment. You will also develop techniques for cluster management, high availability, and disaster recovery of the Analytics Platform and the Hadoop and Spark services, and engineer smart tools and frameworks that provide easy monitoring, troubleshooting, and manageability of our cloud-based analytic services.

About You: You are an engineer who is passionate about developing distributed applications in a multi-tenanted cloud environment. You take pride in developing distributed-systems techniques to coordinate application services, keeping applications highly available and working on their disaster recovery in the cloud. You think not only about what is valuable for developing the right abstractions and modules, but also about the programmatic interfaces that enable customer success. You excel at balancing priorities and making the right tradeoffs between feature content and timely delivery, while ensuring customer success and technology leadership for the company. You can make all of this happen using Java, Spark, and related Hadoop technologies.

Basic Qualifications: 7+ years of software engineering experience, including at least 5+ years of software development experience (using Java, Scala, or other languages) with deep Linux/Unix expertise.

Other Qualifications: Experience building highly available, scalable, reliable multi-tenanted big data applications on cloud (AWS, GCP) and/or data center architectures; working knowledge of distributed-systems principles; experience managing big data frameworks like Spark and/or Hadoop; understanding of resource management using YARN, Kubernetes, etc.

Pursuant to applicable Fair Chance law, Workday will consider for employment qualified applicants with arrest and conviction records. Workday is an Equal Opportunity Employer, including individuals with disabilities and protected veterans. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
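For illustration, the multi-tenant Spark-on-YARN work described above often comes down to configuration like the following; the queue name and executor bounds are invented for the sketch.

```python
# Hedged sketch: a Spark session tuned for a shared YARN cluster.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("prism-analytics-job")
    .master("yarn")
    # Dynamic allocation lets tenants share cluster capacity elastically.
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "2")
    .config("spark.dynamicAllocation.maxExecutors", "50")
    .config("spark.shuffle.service.enabled", "true")
    # Isolate this tenant's workload in its own YARN queue.
    .config("spark.yarn.queue", "tenant-a")
    .getOrCreate()
)
```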

Posted 1 week ago

Apply

4.0 - 9.0 years

35 - 40 Lacs

Chennai

Work from Office

Your work days are brighter here.

About the Team
Workday Prism Analytics is a self-service analytics solution for Finance and Human Resources teams that allows companies to bring external data into Workday, combine it with existing people or financial data, and present it via Workday's reporting framework. This gives the end user a comprehensive collection of insights that can be generated in a flash. We design, build, and maintain the data warehousing systems that underpin our Analytics products. We straddle both applications and systems; the ideal candidate for this role is someone who has a passion for solving hyper-scale engineering challenges to serve the largest companies on the planet.

About the Role
As part of Workday's Prism Analytics team, you will be responsible for the integration of our Big Data Analytics stack with Workday's cloud infrastructure. You will work on building, improving, and extending large-scale distributed data processing frameworks like Spark, Hadoop, and YARN in a multi-tenanted cloud environment. You will also be responsible for developing techniques for cluster management, high availability, and disaster recovery of the Analytics Platform, Hadoop, and Spark services. You will also engineer smart tools and frameworks that provide easy monitoring, troubleshooting, and manageability of our cloud-based analytics services.

About You
You are an engineer who is passionate about developing distributed applications in a multi-tenanted cloud environment. You take pride in developing distributed systems techniques to coordinate application services, ensuring the application remains highly available, and working on disaster recovery for applications in the cloud. You think not only about what is valuable for the development of the right abstractions and modules but also about programmatic interfaces to enable customer success. You also excel at balancing priorities and making the right tradeoffs between feature content and timely delivery of features, while ensuring customer success and technology leadership for the company. You can make all of this happen using Java, Spark, and related Hadoop technologies.

Basic Qualifications
At least 4 years of software development experience (using Java, Scala, or other languages) with deep Linux/Unix expertise.

Other Qualifications
Experience building highly available, scalable, reliable multi-tenanted big data applications on cloud (AWS, GCP) and/or data center architectures.
Working knowledge of distributed systems principles.
Understanding of big data frameworks like Spark and/or Hadoop.
Understanding of resource management using YARN, Kubernetes, etc.

Pursuant to applicable Fair Chance law, Workday will consider for employment qualified applicants with arrest and conviction records. Workday is an Equal Opportunity Employer, including individuals with disabilities and protected veterans. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
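This posting largely mirrors the senior role above, so rather than repeat a batch-processing example, here is a hedged sketch of the "easy monitoring" tooling both postings mention: polling Spark's monitoring REST API (served by the driver UI) to flag failed stages. The driver address, timeout, and report format are assumptions, not Workday's tooling.

import requests

DRIVER_UI = "http://localhost:4040"  # assumed driver UI address

def failed_stage_report(base_url):
    """Return a human-readable line for every failed stage of every app."""
    problems = []
    apps = requests.get(f"{base_url}/api/v1/applications", timeout=5).json()
    for app in apps:
        stages = requests.get(
            f"{base_url}/api/v1/applications/{app['id']}/stages",
            params={"status": "failed"},  # Spark's REST API supports this filter
            timeout=5,
        ).json()
        for stage in stages:
            problems.append(f"{app['name']}: stage {stage['stageId']} failed")
    return problems

if __name__ == "__main__":
    for line in failed_stage_report(DRIVER_UI):
        print(line)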

Posted 1 week ago

Apply

3.0 - 4.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Precisely is the leader in data integrity. We empower businesses to make more confident decisions based on trusted data through a unique combination of software, data enrichment products, and strategic services. What does this mean to you? For starters, it means joining a company focused on delivering outstanding innovation and support that helps customers increase revenue, lower costs, and reduce risk. In fact, Precisely powers better decisions for more than 12,000 global organizations, including 93 of the Fortune 100. Precisely's 2,500 employees are unified by four company core values that are central to who we are and how we operate: Openness, Determination, Individuality, and Collaboration. We are committed to career development for our employees and offer opportunities for growth, learning, and building community. With a "work from anywhere" culture, we celebrate diversity in a distributed environment, with a presence in 30 countries and 20 offices across five continents. Learn more about why it's an exciting time to join Precisely!

Overview:
As a Data Engineer I, this role combines expertise in demography with strong programming skills to create tools that help organizations understand population trends, social dynamics, and economic factors. The ideal candidate will work closely with data scientists, statisticians, and policy analysts to design, develop, and maintain applications that process large datasets, generate reports, and visualize demographic patterns. Responsibilities include coding, testing, and deploying software modules, integrating demographic models, and ensuring data accuracy and security.

What you will do:
Demographic professional with 3 to 4 years of industry experience, involved in the development of demographic/data engineering solutions.
Develop and maintain software applications for demographic data analysis.
Collaborate with data scientists and demographers to integrate demographic models.
Design data processing pipelines to handle large and complex datasets (a small illustrative sketch follows this posting).
Create visualizations and reports to communicate demographic insights.
Ensure data accuracy, security, and compliance with privacy regulations.
Test and debug software to maintain high-quality standards.
Stay updated on demographic research and technological trends.
Optimize software performance and scalability.
Document code and development processes clearly.
Provide technical support and training to end users.
Write clear, compelling, and detailed (technical) user epics and stories with user acceptance criteria.
Participate in story-grooming exercises for crisp and unambiguous documentation and communication of features to be developed.
Collaborate with other team members, and work with cross-functional teams according to requirements.
Follow peer review of code as standard practice.
Evaluate, learn, and incorporate new technologies into new and existing frameworks and solutions as applicable.
Be agile and embrace change.

What we are looking for:
3+ years of industry experience in a demographic role.
Bachelor's or Master's degree in Demography, Geospatial/Geography, Computer Science, Statistics, or a related field.
Proficiency in programming languages such as Python and R.
Experience with database management and data querying (SQL).
Strong understanding of demographic concepts and population data.
Familiarity with data visualization tools and libraries (e.g., Tableau).
Experience with cloud technologies like AWS, Azure, etc.
Excellent knowledge of database concepts and complex query writing.
Excellent knowledge of query optimization for better performance.
Exposure to the geospatial domain and to how geospatial data is stored in databases is preferred.
Ability to communicate with stakeholders at all levels of the organization.
Excellent verbal and written communication skills.
Excellent interpersonal skills; an active listener.
Able to set and meet time-sensitive goals.
Able to handle multiple tasks simultaneously and adapt to change while providing structure to operations and go-to-market teams.

The personal data that you provide as a part of this job application will be handled in accordance with relevant laws. For more information about how Precisely handles the personal data of job applicants, please see the Precisely Global Applicant and Candidate Privacy Notice.
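As referenced in the responsibilities above, here is a small, self-contained illustration of the kind of demographic aggregation and query work this role describes. It uses sqlite3 so it runs anywhere; the table, columns, and numbers are invented for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE population (
        region   TEXT,
        age_band TEXT,    -- e.g. '0-14', '15-64', '65+'
        sex      TEXT,
        count    INTEGER
    );
    -- Indexing the grouping column is a first, cheap query-optimization step.
    CREATE INDEX idx_population_region ON population (region);
""")
conn.executemany(
    "INSERT INTO population VALUES (?, ?, ?, ?)",
    [
        ("North", "0-14", "F", 1200), ("North", "65+", "F", 800),
        ("North", "15-64", "M", 4100), ("South", "65+", "M", 950),
        ("South", "15-64", "F", 3600),
    ],
)

# Old-age dependency summary per region: population 65+ relative to 15-64.
for region, older, working in conn.execute("""
    SELECT region,
           SUM(CASE WHEN age_band = '65+' THEN count ELSE 0 END),
           SUM(CASE WHEN age_band = '15-64' THEN count ELSE 0 END)
    FROM population
    GROUP BY region
"""):
    print(f"{region}: old-age dependency ratio ~ {older / working:.2f}")

A production pipeline would swap sqlite3 for a warehouse engine and read from real census or survey extracts, but the shape of the work (modelling, indexing, aggregating) is the same.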

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job title: Senior Software Engineer
Experience: 5-8 years
Primary skills: Python, Spark or PySpark, DWH ETL
Database: SparkSQL or PostgreSQL
Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog)
Work Model: Hybrid (twice weekly)
Cab Facility: Yes
Work Timings: 10am to 7pm
Interview Process: 3 rounds (3rd round face-to-face, mandatory)
Work Location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045

About Business Unit:
The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Their responsibilities span architectural ownership of critical product features, driving techno-product leadership, enforcing architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. They design multi-cloud and hybrid-cloud solutions that support seamless integration across diverse environments and contribute significantly to interoperability between EPC products and the broader enterprise ecosystem. The team fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient and performant, secure and resilient platforms that form the backbone of Epsilon People Cloud.

Why we are looking for you:
You have experience working as a Data Engineer with strong database fundamentals and an ETL background.
You have experience working in a data warehouse environment and dealing with data volumes of terabytes and above.
You have experience working with relational data systems, preferably PostgreSQL and SparkSQL.
You have excellent design and coding skills and can mentor a junior engineer on the team.
You have excellent written and verbal communication skills.
You are experienced and comfortable working with global clients.
You work well with teams and are able to work with multiple collaborators, including clients, vendors, and delivery teams.
You are proficient with bug-tracking and test-management toolsets that support development processes such as CI/CD.

What you will enjoy in this role:
As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands of the industry. You will get to work on the latest tools and technology and deal with data of petabyte scale, including homegrown frameworks on Spark, Airflow, and more. You will gain exposure to the digital marketing domain, where Epsilon is a market leader, and work closely with consumer data across different segments that will eventually provide insights into consumer behaviours and patterns to design digital ad strategies. As part of a dynamic team, you will have opportunities to innovate and put your recommendations forward, using existing standard methodologies and defining new ones as industry standards evolve. You will have the opportunity to work with Business, System, and Delivery teams to build a solid foundation in the digital marketing domain, in an open and transparent environment that values innovation and efficiency. Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice.

What will you do?
Develop a deep understanding of the business context in which your team operates and present feature recommendations in an agile working environment.
Lead, design, and code solutions on and off the database to ensure application access and enable data-driven decision making for the company's multi-faceted ad-serving operations.
Work closely with engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices.
Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations (a minimal illustrative sketch follows this posting).
Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence.
Mentor junior engineers on the team.
Stay abreast of developments in the data world in terms of governance, quality, and performance optimization.
Run effective client meetings, understand deliverables, and drive successful outcomes.

Qualifications:
Bachelor's degree in Computer Science or an equivalent degree is required.
5-8 years of data engineering experience, with expertise in Apache Spark and databases (preferably Databricks) in marketing technologies and data management, and technical understanding of these areas.
Ability to monitor and tune Databricks workloads to ensure high performance and scalability, adapting to business needs as required.
Solid experience in basic and advanced SQL writing and tuning.
Experience with Python.
Solid understanding of CI/CD practices, with experience using Git for version control and integration in Spark data projects.
Good understanding of disaster recovery and business continuity solutions.
Experience with scheduling applications with complex interdependencies, preferably Airflow.
Good experience working with geographically and culturally diverse teams.
Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue, or Databricks.
Excellent written and verbal communication skills.
Ability to handle complex products.
Good communication and problem-solving skills, with the ability to manage multiple priorities.
Ability to diagnose and solve problems quickly.
Diligent; able to multi-task, prioritize, and quickly change priorities.
Good time management.
Good to have: knowledge of cloud platforms (cloud security) and familiarity with Terraform or other infrastructure-as-code tools.

About Epsilon:
Epsilon is a global data, technology and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology and services they need to engage consumers with 1 View, 1 Vision and 1 Voice. 1 View of their universe of potential buyers. 1 Vision for engaging each individual. And 1 Voice to harmonize engagement across paid, owned and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements.
Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
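As flagged in the responsibilities above, here is a minimal sketch of a Databricks-style ETL step in PySpark. It is not Epsilon's pipeline; the paths, columns, and table layout are hypothetical, and the Delta format assumes a Delta-enabled runtime such as Databricks.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("campaign-etl").getOrCreate()

# Extract: raw consumer actions landed as JSON (hypothetical path).
raw = spark.read.json("/mnt/raw/consumer_actions")

# Transform: de-duplicate, drop rows without a timestamp, derive a date column.
cleaned = (
    raw.dropDuplicates(["action_id"])
       .filter(F.col("action_ts").isNotNull())
       .withColumn("action_date", F.to_date("action_ts"))
)

# Load: append into a date-partitioned Delta table.
(
    cleaned.write
    .format("delta")  # assumes a Delta-enabled runtime
    .mode("append")
    .partitionBy("action_date")
    .save("/mnt/curated/consumer_actions")
)

In practice a job like this would be one task in an Airflow DAG (the scheduler the posting names), with upstream sensors gating on data arrival.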

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Big Data Engineer at Barclays, where you will spearhead the evolution of the digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. To be successful as a Big Data Engineer, you should have experience with:
- Full-stack software development for large-scale, mission-critical applications.
- Mastery of distributed big data systems such as Spark, Hive, Kafka streaming, Hadoop, and Airflow (a small streaming sketch follows this posting).
- Expertise in Scala, Java, Python, J2EE technologies, microservices, Spring, Hibernate, and REST APIs.
- Experience with n-tier web application development and frameworks like Spring Boot, Spring MVC, JPA, and Hibernate.
- Proficiency with version control systems, preferably Git; GitHub Copilot experience is a plus.
- Proficiency in API development using SOAP or REST, JSON, and XML.
- Experience developing back-end applications with multi-process and multi-threaded architectures.
- Hands-on experience building scalable microservices solutions using integration design patterns, Docker, containers, and Kubernetes.
- Experience with DevOps practices like CI/CD, test automation, and build automation using tools like Jenkins, Maven, Chef, Git, and Docker.
- Experience with data processing in cloud environments like Azure or AWS.
- Data product development experience (essential).
- Experience with Agile development methodologies like SCRUM.
- A results-oriented mindset with strong analytical and problem-solving skills.
- Excellent verbal and written communication and presentation skills.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is for the Pune location.

Purpose of the role:
To design, develop, and improve software, utilizing various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Analyst Expectations:
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- In-depth technical knowledge and experience in the assigned area of expertise.
- Thorough understanding of the underlying principles and concepts within the area of expertise.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
- For an individual contributor, develop technical expertise in the work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision-making within your own area of expertise.
- Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset of Empower, Challenge, and Drive, the operating manual for how we behave.
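As noted in the skills list above, here is a hedged sketch of Spark Structured Streaming reading from Kafka, the combination the posting names. It is not Barclays' code; the topic, bootstrap servers, and checkpoint path are assumptions, and the spark-sql-kafka connector package must be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

# Source: a Kafka topic (name and servers assumed for the example).
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "payments")
    .load()
)

# Kafka delivers key/value as binary; cast before processing, and keep the
# broker-assigned timestamp for windowing.
counts = (
    stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
    .groupBy(F.window(F.col("timestamp"), "1 minute"))
    .count()
)

# Sink: print per-minute message counts; the checkpoint makes the query restartable.
query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/payments")  # assumed path
    .start()
)
query.awaitTermination()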

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies