3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Application Automation Engineer
Project Role Description: Deliver predictive and intelligent delivery approaches based on automation and analytics. Drive the automation of delivery analytics to gather insights from data.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Automation Engineer, you will apply innovative ideas to drive the automation of Delivery Analytics at the client level. Your typical day will involve collaborating with cross-functional teams to identify opportunities for automation, analyzing existing processes, and implementing solutions that enhance efficiency and effectiveness. You will engage in discussions to share insights and contribute to the continuous improvement of analytics delivery, ensuring that automation initiatives align with client needs and expectations. You will also stay updated on industry trends and best practices to foster a culture of innovation within the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement automation strategies that enhance delivery analytics.
- Collaborate with stakeholders to gather requirements and translate them into actionable automation solutions.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data modeling concepts and best practices.
- Experience with data extraction, transformation, and loading processes.
- Familiarity with reporting tools and techniques to visualize data effectively.
- Ability to troubleshoot and optimize existing data models for performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
7.0 - 11.0 years
7 - 11 Lacs
Chennai
Work from Office
About The Role
Skill required: Com.Bkg - Commercial Real Estate - General Ledger Reconciliations
Designation: Banking Advisory Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? We help clients address quality and productivity issues, keep pace with customer expectations, navigate regulatory and operational pressures, and ensure business sustainability by transforming their banking operations into an agile, resilient operating model. The Corporate Banking/Wholesale Banking team is responsible for helping clients and organizations process trade finance transactions by providing superior service delivery to trade customers whilst safeguarding the bank from risks associated with this business. The role computes and presents the results of financial events by administering, managing, and processing general ledger accounts. It accounts for current assets, fixed assets, liabilities, revenue and expense items, and gains and losses, and may handle related tax issues, invoice management, and budgeting.

What are we looking for? The candidate will play a key supporting role in the lease budget/reforecasting process, assisting with coordination across departments, data tracking, and execution of leasing-related activities. This highly detail-oriented position requires strong organizational skills, a process-driven mindset, and the ability to manage and analyze large datasets with accuracy. The ideal candidate will be a proactive team player, comfortable working in a fast-paced environment, and capable of supporting cross-functional collaboration through clear communication and diligent follow-through.
- Experience with data visualization tools (e.g., Power BI) or data extraction tools (e.g., Looker) is a plus.
- Familiarity with real estate or leasing processes is a plus.

Roles and Responsibilities:
- Assist in managing the leasing process lifecycle, including tracking progress, updating supporting files and deliverables, and maintaining documentation.
- Coordinate with internal teams (finance, asset management, research) to gather inputs and ensure timely delivery of leasing-related data and reports.
- Maintain and update leasing trackers, dashboards, and reporting tools to monitor completion progress.
- Aggregate and validate large datasets related to leasing activity, ensuring accuracy and completeness.
- Support the preparation of recurring and ad hoc reports for senior leadership.
- Identify data discrepancies or process inefficiencies and escalate internally with recommended solutions.
- Help develop and refine process documentation, templates, and checklists to improve operational efficiency.
- Participate in cross-functional meetings and follow up on action items to ensure accountability and progress.

Qualification: Any Graduation
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Participate in code reviews to ensure adherence to best practices and coding standards.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and methodologies.
- Experience with ETL processes and tools for data extraction, transformation, and loading.
- Familiarity with reporting tools and techniques to present data insights effectively.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with the latest technologies and methodologies.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with data extraction, transformation, and loading processes.
- Familiarity with reporting tools and techniques to visualize data effectively.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with the latest technologies and methodologies.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with data extraction, transformation, and loading processes.
- Familiarity with reporting tools and techniques for data visualization.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 4 days ago
2.0 - 5.0 years
3 - 8 Lacs
Chennai
Work from Office
Role & Responsibilities:
- Extract, clean, and analyze large data sets using SQL.
- Design and build dashboards and reports using Power BI.
- Knowledge of Python, DBT, Git, ClickHouse, Postgres, and Airflow is optional.
- Translate business problems into data questions and analytics models.
- Partner with business stakeholders to identify KPIs and track performance metrics.
- Develop predictive and prescriptive models to support decision-making.
- Present findings and actionable insights to senior leadership.
- Ensure data quality, consistency, and governance standards are maintained.
- Mentor junior analysts or interns as needed.

Preferred Candidate Profile:
- Bachelor's or Master's degree in Statistics, Data Science, Computer Science, Mathematics, Economics, or a related field.
- 2+ years of experience in data analytics, business intelligence, or data science roles.
- Strong proficiency in SQL, Excel, and data visualization tools (e.g., Power BI).
- Good understanding of databases, ETL pipelines, and cloud data environments (e.g., AWS, GCP, Azure).
- AI & ML knowledge will be an added advantage.
- Excellent communication and storytelling skills.
- Strong problem-solving skills and attention to detail.

Interested parties, please forward your full resume to career.india@emiratesline.com
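As a rough, hedged illustration of the SQL-extraction-plus-dashboard workflow this role describes (not the company's actual systems), the sketch below cleans records with a SQL query and aggregates them into a table a Power BI report could consume. The shipments table, its columns, and the CSV hand-off are invented for the example; an in-memory SQLite database stands in for Postgres or ClickHouse.

```python
# Minimal sketch: extract and clean data with SQL, then aggregate for a dashboard.
# The shipments table, columns, and file names are illustrative assumptions.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")  # stand-in for a Postgres/ClickHouse connection
conn.executescript("""
    CREATE TABLE shipments (id INTEGER, region TEXT, revenue REAL, shipped_on TEXT);
    INSERT INTO shipments VALUES
        (1, 'South', 1200.0, '2024-01-05'),
        (2, 'South', NULL,   '2024-01-09'),
        (3, 'North', 800.0,  '2024-02-02');
""")

# Extraction and cleaning in SQL: drop rows with missing revenue, normalise the date.
query = """
    SELECT region,
           strftime('%Y-%m', shipped_on) AS month,
           revenue
    FROM shipments
    WHERE revenue IS NOT NULL
"""
df = pd.read_sql_query(query, conn)

# KPI-style aggregation that a Power BI dashboard could visualise.
monthly_revenue = df.groupby(["region", "month"], as_index=False)["revenue"].sum()
monthly_revenue.to_csv("monthly_revenue.csv", index=False)  # hand-off to the BI layer
print(monthly_revenue)
```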
Posted 4 days ago
1.0 - 5.0 years
8 - 14 Lacs
Hyderabad
Work from Office
What will I be doing?
- Solution and configure Zenoti to meet customers' business processes.
- Solution, design, and set up proofs of concept and pilots for high-priority and large customers, ensuring successful implementations.
- Extract, transform, and load data across systems into Zenoti.
- Identify significant risks and unknowns, and define and drive mitigation plans.
- Address customer issues and concerns in a timely fashion to ensure customer satisfaction.
- Stay up to date with product knowledge, business flow, sales processes, and market dynamics.
- Build expertise in data migration tools and techniques and legacy software data structures in order to improve the quality of the customer onboarding experience.
- Maintain complete documentation and follow organizational processes to ensure the successful implementation of Zenoti products.
- Ensure adherence to SLAs and key metrics according to the organization's goals and objectives.
- Coordinate with internal teams as needed to meet customer needs and requirements, while managing customer expectations.

What skills do I need?
- 1-5 years of experience in implementing software systems, with hands-on experience in data transformation, system validation, and migration tasks.
- Deep knowledge of MS Excel features; working knowledge of database systems is a plus.
- Ability to use tools/scripts to transform data for setting up customer sites.
- Ability to innovate and develop tools to enhance the migration process.
- Experience with data migrations and data mapping.
- Good to have: knowledge of web design using HTML.
- Ability to adhere to and develop quality checks to demonstrate the integrity of data migration from legacy systems into Zenoti.
- A technology-centric background.
- Strong logical, analytical, and problem-solving skills.
- Excellent communication skills.
- Can work in a fast-paced environment across multiple projects.
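To illustrate the transform-and-validate work described above, here is a minimal, hedged sketch, not Zenoti's actual import format: it maps a hypothetical legacy appointments export onto a target schema and runs a simple integrity check before load. All column names, the date convention, and the output file are assumptions made for the example.

```python
# Hedged sketch of a legacy-to-target data transformation for a migration.
# The legacy export, target columns, and validation rule are illustrative assumptions.
import pandas as pd

legacy = pd.DataFrame({
    "ClientName": ["  Asha Rao", "J. Mehta", ""],
    "ApptDate":   ["05/01/2024", "09/01/2024", "not a date"],
    "Amt":        ["1,200", "800", None],
})

target = pd.DataFrame({
    "guest_name":  legacy["ClientName"].str.strip(),
    "appointment": pd.to_datetime(legacy["ApptDate"], dayfirst=True, errors="coerce"),
    "amount":      pd.to_numeric(legacy["Amt"].str.replace(",", ""), errors="coerce"),
})

# Simple quality check before load: flag rows that would break the import.
invalid = target[target["guest_name"].eq("") | target["appointment"].isna() | target["amount"].isna()]
print(f"{len(invalid)} of {len(target)} rows need manual review")
target.drop(invalid.index).to_csv("import_ready.csv", index=False)
```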
Posted 4 days ago
7.0 - 10.0 years
1 - 2 Lacs
Chennai
Hybrid
We are looking for a Senior Talend Developer with 7+ years of experience in designing and maintaining ETL workflows using Talend. The role involves data integration, pipeline optimization, and collaboration with BI and cloud teams. Strong SQL skills are required.
Posted 4 days ago
4.0 - 9.0 years
18 - 20 Lacs
Hyderabad
Hybrid
Position Job Title: Business Intelligence Developer
Reports To: Business Intelligence Manager

Primary Purpose: The BI Developer applies business and advanced technical expertise in meeting business data and reporting needs. The position supports business planning by compiling, visualizing, and analyzing business and statistical data from UCW's information systems. The BI Developer liaises with various stakeholders across the university to provide them with the data, reporting, and analysis required to make informed, data-driven decisions. The Business Intelligence Developer will work on projects that will have a significant impact on the student, faculty, and staff experience.

Specific Responsibilities: The BI Developer will at various times be responsible for the following, as well as other related duties as assigned to support the business objectives and purpose of the Company.
- Design relational databases to support business enterprise applications and physical data modeling according to business requirements.
- Gather requirements from various business departments at UCW and transform them into self-serve reports/dashboards for the various business units using Power BI.
- Understand ad-hoc data requirements and convert them into reporting deliverables.
- Contribute to driving reporting automation and simplification to free up time for in-depth analyses.
- Collaborate with internal and external team members, including system architects, software developers, database administrators, and design analysts, to find creative and innovative approaches to enrich business data.
- Provide business and technical expertise for the analytics process, tools, and applications for the University.
- Identify opportunities that improve data accuracy and efficiency of our processes.
- Contribute to the development of training materials, documenting processes, and delivering sessions.
- Develop strategies for data modeling, design, transport, and implementation to meet requirements for metadata management, operational data stores, and ELT/ETL environments.
- Create and test data models for a variety of business data, applications, database structures, and metadata tables to meet operational goals for performance and efficiency.
- Research modern technologies, data modeling methods, and information management systems and recommend changes to company data architectures.
- Contribute to a team environment where all team members consistently experience a sense of belonging and inclusion.

Position Requirements
Competencies:
- Demonstrated experience in creating complex data models and developing insightful reports and dashboards using Microsoft Power BI.
- Must possess advanced skills in using DAX queries for Power BI.
- Connecting data sources, importing data, and cleaning and transforming data for business intelligence.
- Knowledge of database management principles and experience working with SQL/MySQL databases.
- Ability to implement row-level security on data, along with an understanding of application security layer models in Power BI.
- Ability to translate business requirements into informative reports/visuals.
- A good sense of design that will help communicate data using visually compelling reports and dashboards.
- Experience in ETL (Extract, Transform and Load) processes is an asset.
- Experience in the development of a data warehouse is an asset.
- Data analysis and visualization skills using Python and/or R are an asset.
- Strong analytical, problem-solving, and data analysis skills.
- Ability to ensure organizational data privacy and confidentiality.
- Understanding of statistical analysis techniques such as correlation and regression.
- Demonstrated ability to collect data from a variety of sources, synthesize data, produce reports, and make recommendations.
- Ability to manage multiple concurrent tasks and competing demands.

Education and Experience:
- Bachelor's or Master's degree in Business, Information Systems, Computer Science, or a related discipline.
- Demonstrated experience in using Power BI to create reports, dashboards, and self-serve analytics.
- Must have 3+ years of experience in data-specific roles, especially in the use of Power BI, Excel, and SQL.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
We are seeking a talented, enthusiastic, and hardworking individual who is passionate about automation, data engineering, and semiconductor technology to join our R&D team. This role is focused on automating integrated circuit (IC) design flows, optimizing data management, and enhancing workflow efficiency in analog/mixed-signal (AMS) design environments.

As an Analog Automation Engineer, you will collaborate with IC design teams to streamline design, simulation, and verification processes. A fundamental understanding of integrated circuits and semiconductor technology, as well as experience using analog/mixed-signal CAD tools and programming in Python/Perl, is essential for success in this role.

You will be responsible for collaborating with IC design engineers to identify bottlenecks and develop automation solutions to improve productivity. This includes optimizing design simulation, verification, and collateral generation through Python/Perl scripting and automation frameworks. Additionally, you will implement data engineering pipelines to process, clean, and analyze large datasets from IC design tools, and enhance data visualization and reporting to provide insights for design teams and management. Documenting automation workflows, database structures, and best practices for team collaboration and knowledge sharing will also be part of your responsibilities.

Requirements for this position include a MASc or BASc with 3+ years of analog and mixed-signal IC design experience, strong Python programming skills with experience in workflow automation, proficiency in SQL databases and data engineering concepts, excellent analytical skills with a data-driven mindset, and strong communication and time management skills with the ability to work effectively in a team-oriented environment.

Joining us means working in an innovative R&D team that is shaping the future of IC design automation, gaining experience with cutting-edge semiconductor and automation technologies, and growing in a collaborative and learning-driven environment. We offer a competitive salary, benefits, and career development opportunities. If you are a motivated engineer passionate about automation, data engineering, and semiconductor technology, we invite you to be part of our innovative team!

Synopsys Canada ULC values the diversity of our workforce and is committed to providing access and opportunities to individuals with disabilities. Reasonable accommodations will be provided throughout the recruitment and employment process. For accommodation inquiries, please contact hr-help-canada@synopsys.com.
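A small, hypothetical example of the data-engineering side of this role, aggregating measurement results exported by a simulator into a quick pass/fail summary. The CSV layout, measurement names, and the 60 dB spec limit are invented for illustration and are not taken from the posting.

```python
# Hedged sketch: aggregate per-corner simulation measurements into a summary table.
# The CSV layout, measurement names, and spec limit are illustrative assumptions.
import csv
import io
from statistics import mean

# Stand-in for a measurements export from an AMS simulation run.
raw = io.StringIO("""corner,temp_c,gain_db,offset_mv
tt,25,61.2,0.8
ss,125,58.7,1.9
ff,-40,63.4,0.3
""")

rows = list(csv.DictReader(raw))
by_corner = {r["corner"]: float(r["gain_db"]) for r in rows}

print(f"mean gain: {mean(by_corner.values()):.2f} dB")
for corner, gain in by_corner.items():
    status = "PASS" if gain >= 60.0 else "FAIL"  # hypothetical 60 dB spec
    print(f"{corner:>3}: {gain:5.1f} dB  {status}")
```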
Posted 6 days ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
The role involves populating databases with information gathered from various sources and extracting data for analyst teams. You will be responsible for creating and populating templates, tables, figures, and graphics using tools like PowerPoint and Excel. Your tasks will include interpreting data provided by analysts and transforming them into professional presentations and deliverables. Additionally, you will process primary market research data, conduct statistical analysis, and ensure data quality through validation and cleaning processes. You will also be responsible for editing reports, slide decks, and other publications to maintain quality standards and consistency.

To qualify for this position, you should have an undergraduate degree with a strong academic background, preferably in Business or Commerce. An advanced degree or relevant experience in sectors like biopharma, healthcare, or market research would be advantageous. Attention to detail, analytical skills, and proficiency with SPSS software are essential requirements. Strong communication skills, both written and verbal, are necessary, along with the ability to work collaboratively and independently. Proficiency in software tools like E-tabs, SQL, and Tableau is a plus, along with a proactive approach to work, critical thinking skills, and the ability to manage time effectively under pressure.

The ideal candidate should be self-motivated, open to collaboration, and capable of adapting to new technologies and process improvements. Proficiency in Microsoft Office tools and experience in data analysis and visualization are desirable. While knowledge of the pharmaceutical or medical device industries is beneficial, it is not mandatory. The role requires the ability to handle multiple tasks in a fast-paced environment and excellent communication skills. The position may involve working in rotational shifts and meeting SLAs.

If you are someone who thrives in a dynamic work environment, possesses strong analytical skills, and enjoys working with data to create impactful deliverables, this role offers an exciting opportunity to contribute to the success of client projects and drive business transformation.
Posted 6 days ago
0.0 - 4.0 years
0 Lacs
Maharashtra
On-site
As an Intern in the Insurance Domain, you will primarily be responsible for data management and data extraction tasks. Your role will involve aligning meetings, coordinating with clients for operational work, and ensuring the accuracy and completeness of data. The key tasks will include managing insurance-related data, extracting relevant information, and ensuring its availability for further analysis.

You will be required to participate in meetings to align strategies and discuss progress. Additionally, you will collaborate with clients to coordinate various operational aspects and address their concerns effectively.

This internship opportunity offers benefits such as paid time off and provident fund contribution. The work schedule will involve day shifts, morning shifts, and availability on weekends. The work location is in person, providing you with valuable hands-on experience in the insurance domain.
Posted 6 days ago
1.0 - 5.0 years
0 Lacs
Guwahati, Assam
On-site
The job involves maintaining and updating databases with accurate data entry. You will be responsible for compiling and analyzing data as required by management. Additionally, you will handle data extraction, verification, and validation for reports. Organizing and maintaining office files and documentation will also be part of your responsibilities. You will assist in preparing reports, letters, and presentations as required.

In this role, you will support administrative tasks such as scheduling meetings, handling correspondence, and making travel arrangements for executives. Coordination with different departments to ensure smooth operation and timely communication is essential. Managing office supplies inventory and procurement will also be part of your duties. You will be responsible for ensuring accurate and timely billing processes.

This is a full-time position suitable for freshers. The work schedule includes day and morning shifts. The job location is in person. The ideal candidate should have at least a Higher Secondary (12th Pass) education. Prior work experience of one year is preferred. The company offers a yearly bonus as part of the compensation package.
Posted 6 days ago
2.0 - 4.0 years
25 - 30 Lacs
Bengaluru
Work from Office
* Design, build, and optimize data pipelines and workflows, primarily using Google BigQuery and related GCP tools
* Work with structured and semi-structured data across relational and NoSQL databases
* Develop robust data transformation scripts using Python and SQL
* Ensure data quality, integrity, and validation across ingestion and transformation processes
* Collaborate with business and technical stakeholders to understand data requirements and deliver scalable solutions
* Maintain clear and effective communication, managing stakeholder expectations proactively
* Support data initiatives in industries such as Retail, CPG, and Manufacturing

Qualifications and Education Requirements
Must-Have:
* Strong hands-on experience with Google BigQuery
* Excellent proficiency in SQL and Python for data transformation
* Deep understanding of relational data modeling
* Experience with data extraction, loading (ETL/ELT), and transformation techniques
* Experience with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases
* GCP Certification is a plus

Nice to Have:
* Exposure to other cloud platforms and hyperscaler tools (e.g., AWS Redshift, Snowflake, Azure Synapse/Data Factory/Databricks, Oracle Exadata, Cosmos DB)
* Knowledge of other GCP data tools such as Cloud Dataflow, Cloud Composer, Google Dataproc, or Looker
* Experience with data quality assessment and data governance practices

Soft Skills
* Self-starter with a strong problem-solving mindset
* Excellent written and verbal communication
* Strong stakeholder management and collaboration skills
* Team player with a proactive attitude
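For context on the BigQuery-plus-Python stack this posting centres on, here is a minimal, hedged sketch using the google-cloud-bigquery client. The project, dataset, and table names are placeholders, authentication is assumed to be configured, and the query is only an example of a transformation step, not the employer's actual pipeline.

```python
# Hedged sketch: run a transformation query in BigQuery and pull the result into pandas.
# Project, dataset, and table names are placeholders; authentication is assumed to be
# configured (e.g. via GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # placeholder project id

sql = """
    SELECT region,
           DATE_TRUNC(order_date, MONTH) AS month,
           SUM(amount) AS revenue
    FROM `my-analytics-project.sales.orders`   -- placeholder table
    WHERE amount IS NOT NULL
    GROUP BY region, month
"""

df = client.query(sql).to_dataframe()  # requires the pandas extras for the client
print(df.head())

# Typical ELT hand-off: write the aggregate back to a reporting table.
job = client.load_table_from_dataframe(
    df, "my-analytics-project.sales.monthly_revenue"  # placeholder destination
)
job.result()  # wait for the load job to finish
```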
Posted 6 days ago
5.0 - 10.0 years
20 - 25 Lacs
Noida, Mumbai, Pune
Work from Office
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

About the Role: We are looking for a highly motivated and skilled Senior Data Scientist or Data Scientist to join our team. The ideal candidate will have a strong background in data science, machine learning, and statistical analysis, with hands-on experience in Python and industry-standard libraries. You will be responsible for deriving actionable insights, building predictive models, and effectively communicating findings through data storytelling.

Key Responsibilities:
- Develop, implement, and optimize machine learning models for predictive analytics and decision-making.
- Work with structured and unstructured data to extract meaningful insights and patterns.
- Utilize Python and standard data science libraries such as NumPy, Pandas, SciPy, Scikit-Learn, TensorFlow, PyTorch, and Matplotlib for data analysis and model building.
- Design and develop data pipelines for efficient processing and analysis.
- Conduct exploratory data analysis (EDA) to identify trends and anomalies.
- Collaborate with cross-functional teams to integrate data-driven solutions into business strategies.
- Use data visualization and storytelling techniques to communicate complex findings to non-technical stakeholders.
- Stay updated with the latest advancements in machine learning and AI technologies.

Required Qualifications:
- 5+ years or 8+ years of hands-on experience in data science and machine learning.
- Strong proficiency in Python and relevant data science packages.
- Experience with machine learning frameworks such as TensorFlow, Keras, or PyTorch.
- Knowledge of SQL and database management for data extraction and manipulation.
- Expertise in statistical analysis, hypothesis testing, and feature engineering.
- Understanding of Marketing Mix Modeling is a plus.
- Experience with data visualization packages such as Matplotlib, Seaborn, or Plotly.
- Strong problem-solving skills and ability to work with large datasets.
- Excellent communication skills with a knack for storytelling using data.

Preferred Qualifications:
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of big data technologies like Hadoop, Spark, or Databricks.
- Exposure to NLP, computer vision, or deep learning techniques.
- Understanding of A/B testing and experimental design.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking "Introduce Yourself" in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
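As a hedged illustration of the kind of predictive-modelling work listed above, and not the company's actual pipeline, here is a minimal scikit-learn baseline on a synthetic dataset; the features and the logistic-regression choice are assumptions made purely for the example.

```python
# Hedged sketch: train and evaluate a simple predictive model with scikit-learn.
# The synthetic dataset and features stand in for real business data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42
)

# Scaling plus logistic regression is a reasonable baseline before anything fancier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"test ROC-AUC: {roc_auc_score(y_test, probs):.3f}")
```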
Posted 6 days ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
JD:
- Develop and maintain PL/SQL procedures, functions, packages, and triggers.
- Core skills: PL/SQL, Google BigQuery, Unix shell scripting; experience in JavaScript is optional.
- Design and optimize complex SQL queries for data extraction and reporting.
- Perform data modelling, schema design, and database tuning.
- Collaborate with application developers to integrate backend logic with frontend applications.
- Conduct unit testing and support system integration and user acceptance testing.
- Monitor and troubleshoot database performance issues.
- Maintain documentation for database structures and processes.

Skills: BigQuery, DataStage, PL/SQL, SQL, Unix shell scripting
Posted 6 days ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job title: Senior Software Engineer
Experience: 5-8 years
Primary skills: Python, Spark or PySpark, DWH ETL
Database: SparkSQL or PostgreSQL
Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog)
Work Model: Hybrid (twice weekly)
Cab Facility: Yes
Work Timings: 10am to 7pm
Interview Process: 3 rounds (3rd round face-to-face, mandatory)
Work Location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045

About the Business Unit: The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Their responsibilities span architectural ownership of critical product features, driving techno-product leadership, enforcing architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. They design multi-cloud and hybrid-cloud solutions that support seamless integration across diverse environments and contribute significantly to interoperability between EPC products and the broader enterprise ecosystem. The team fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient and performant, secure and resilient platforms that form the backbone of Epsilon People Cloud.

Why we are looking for you:
- You have experience working as a Data Engineer with strong database fundamentals and an ETL background.
- You have experience working in a data warehouse environment and dealing with data volumes in terabytes and above.
- You have experience working with relational data systems, preferably PostgreSQL and SparkSQL.
- You have excellent design and coding skills and can mentor a junior engineer in the team.
- You have excellent written and verbal communication skills.
- You are experienced and comfortable working with global clients.
- You work well with teams and are able to work with multiple collaborators, including clients, vendors, and delivery teams.
- You are proficient with bug tracking and test management toolsets to support development processes such as CI/CD.

What you will enjoy in this role:
- As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands in the industry.
- You will get to work on the latest tools and technology and deal with data of petabyte scale.
- Work on homegrown frameworks on Spark, Airflow, etc.
- Exposure to the digital marketing domain, where Epsilon is a market leader.
- Understand and work closely with consumer data across different segments that will eventually provide insights into consumer behaviours and patterns to design digital ad strategies.
- As part of the dynamic team, you will have opportunities to innovate and put your recommendations forward.
- Use existing standard methodologies and define new ones as per evolving industry standards.
- Opportunity to work with Business, System, and Delivery to build a solid foundation in the digital marketing domain.
- An open and transparent environment that values innovation and efficiency.

Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice.

What will you do?
- Develop a deep understanding of the business context under which your team operates and present feature recommendations in an agile working environment.
- Lead, design, and code solutions on and off database to ensure application access and enable data-driven decision making for the company's multi-faceted ad serving operations.
- Work closely with Engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices.
- Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations.
- Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence.
- Mentor junior engineers in the team.
- Stay abreast of developments in the data world in terms of governance, quality, and performance optimization.
- Run effective client meetings, understand deliverables, and drive successful outcomes.

Qualifications:
- Bachelor's degree in Computer Science or an equivalent degree is required.
- 5-8 years of data engineering experience with expertise using Apache Spark and databases (preferably Databricks) in marketing technologies and data management, and technical understanding in these areas.
- Monitor and tune Databricks workloads to ensure high performance and scalability, adapting to business needs as required.
- Solid experience in basic and advanced SQL writing and tuning.
- Experience with Python.
- Solid understanding of CI/CD practices, with experience in Git for version control and integration for Spark data projects.
- Good understanding of Disaster Recovery and Business Continuity solutions.
- Experience with scheduling applications with complex interdependencies, preferably Airflow.
- Good experience in working with geographically and culturally diverse teams.
- Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue, or Databricks.
- Excellent written and verbal communication skills.
- Ability to handle complex products.
- Good communication and problem-solving skills, with the ability to manage multiple priorities.
- Ability to diagnose and solve problems quickly.
- Diligent; able to multi-task, prioritize, and quickly change priorities.
- Good time management.
- Good to have: knowledge of cloud platforms (cloud security) and familiarity with Terraform or other infrastructure-as-code tools.

About Epsilon: Epsilon is a global data, technology and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology and services they need to engage consumers with 1 View, 1 Vision and 1 Voice. 1 View of their universe of potential buyers. 1 Vision for engaging each individual. And 1 Voice to harmonize engagement across paid, owned and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements.
Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
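As a rough, hedged sketch of the Databricks/PySpark ETL work the posting above describes, the following batch job reads raw events, aggregates them, and writes a Delta table. The paths, column names, and Delta output are illustrative assumptions, not Epsilon's actual pipelines.

```python
# Hedged sketch: a small PySpark batch transformation of the kind described above.
# Paths, column names, and the Delta output are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("demo-etl").getOrCreate()

# Extract: read raw events (placeholder path).
events = spark.read.json("/mnt/raw/ad_events/")

# Transform: keep valid rows and aggregate daily impressions per campaign.
daily = (
    events
    .filter(F.col("campaign_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("campaign_id", "event_date")
    .agg(F.count("*").alias("impressions"))
)

# Load: write to a Delta table (on Databricks); partitioning by date keeps reads cheap.
daily.write.format("delta").mode("overwrite").partitionBy("event_date") \
     .save("/mnt/curated/daily_impressions")
```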
Posted 6 days ago
0.0 - 4.0 years
0 Lacs
Pune, Maharashtra
On-site
As an intern at Inspacco, your day-to-day responsibilities will include conducting online research to gather information on markets, competitors, and industry trends. You will use basic tools such as Google, company databases, and online directories to extract relevant data. Additionally, you will be responsible for maintaining and organizing collected data using Microsoft Excel. Furthermore, you will be expected to create clear and concise reports and dashboards based on data insights. Your role will also involve assisting in identifying growth opportunities, customer needs, and market gaps to contribute to the company's overall success.

Inspacco, founded in 2019, is led by IIM alumni and armed forces professionals with the goal of providing affordable improvement and maintenance services. Serving as a one-stop solution for all residential, commercial, and industrial establishments, Inspacco has built trust with over 100 large customers within a year of its establishment. The company's mission is centered around delivering the highest-quality products and services in the improvement and maintenance portfolio for residential, commercial, and industrial spaces at competitive prices. With a vision to become a leader in improvement and maintenance-related services, Inspacco aims to provide superior value to its customers.
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Specialist in Product Strategy and Operations, you will play a crucial role in the development and execution of the Ethoca Clarity product. Your responsibilities will involve collaborating with various teams to translate requirements from Product Managers, customers, and partners into tangible product features. Your attention to detail and coordination skills will be key in ensuring the successful evolution of the product.

You will be tasked with supporting core product workstreams by engaging with internal teams, vendors, and contractors to ensure timely delivery. Your role will also involve owning documentation, quality assurance processes, and tracking progress to provide internal stakeholders with visibility into project status and metrics. Additionally, you will lead status reviews, coordinate across different time zones, and assist in go-to-market strategies in collaboration with Operations, Engineering, and Market teams.

Collaboration will be a central aspect of your role, as you will work closely with product management, development teams, customers, and market stakeholders to gather and validate requirements. Maintaining and prioritizing the product backlog to align with strategic goals will be essential, along with executing the product roadmap within Agile principles and managing iterations effectively.

Furthermore, you will liaise with internal stakeholders such as Operations and Data Teams to ensure the successful delivery of product releases. Developing training materials, documentation, and user guides to drive product adoption will also be part of your responsibilities. Your proficiency in data extraction, transformation, and loading to support business intelligence and analytics, including managing data dashboards, will be advantageous in this role.

To excel in this position, you should possess demonstrable experience in product discovery, execution, and commercialization, particularly with data-driven products. Experience in working with external vendors or delivery partners, managing timelines, output quality, and alignment to goals will be beneficial. Proficiency in SQL, Python, and/or Business Intelligence tools such as Tableau, Domo, or Power BI is desired. Strong analytical skills, problem-solving abilities, and effective written and verbal communication are essential traits for success in this role. You must be adaptable to a rapidly changing environment, open to learning and applying new technologies, and stay current with industry trends and advancements. A Bachelor's or Master's Degree in Product Management, Statistics, Data Science, or equivalent work experience is preferred for this position.

As part of your corporate security responsibility, you are expected to abide by Mastercard's security policies and practices, maintain the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and undergo periodic mandatory security trainings as per Mastercard's guidelines.
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Mumbai
Work from Office
We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Required Skills:
- 2+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience in handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Preferred Qualifications:
- Experience with cloud platforms like Azure
- Familiarity with data lakes and data lakehouse architecture
- Exposure to CI/CD and DevOps practices in data engineering projects is an added advantage
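To complement the batch-style example shown earlier, here is a minimal, hedged Spark Structured Streaming sketch of the "streaming data processing" item above. The landing directory, schema, window size, and console sink are all invented for illustration; a production job would more likely read from Kafka or Auto Loader and write to a table.

```python
# Hedged sketch: a simple Spark Structured Streaming job (file source -> console sink).
# The input path, schema, and trigger interval are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("demo-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Read newly arriving JSON files as a stream (placeholder landing directory).
orders = spark.readStream.schema(schema).json("/mnt/landing/orders/")

# Windowed aggregation with a watermark to bound streaming state.
per_minute = (
    orders
    .withWatermark("event_ts", "10 minutes")
    .groupBy(F.window("event_ts", "1 minute"))
    .agg(F.sum("amount").alias("revenue"))
)

query = (
    per_minute.writeStream
    .outputMode("update")
    .format("console")
    .trigger(processingTime="30 seconds")
    .start()
)
query.awaitTermination()
```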
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad
Work from Office
Job Description
Experience: 5 to 7 Years
Work Location: Hyderabad

As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team!

What You'll Tackle Each Day:
- End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support.
- Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau.
- SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis. PySpark experience is an added advantage.
- Independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities.
- Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions.
- Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense.

Qualifications: With 5 to 7 years of experience in Business Intelligence, focusing on Tableau development and SQL, you consistently deliver impactful BI solutions. A strong understanding of data visualization best practices is expected.
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Mumbai
Work from Office
We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Required Skills:
- 2+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience in handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Preferred Qualifications:
- Experience with cloud platforms like Azure
- Familiarity with data lakes and data lakehouse architecture
- Exposure to CI/CD and DevOps practices in data engineering projects is an added advantage
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Mumbai
Work from Office
We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Required Skills:
- 2+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience in handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Preferred Qualifications:
- Experience with cloud platforms like Azure
- Familiarity with data lakes and data lakehouse architecture
- Exposure to CI/CD and DevOps practices in data engineering projects is an added advantage
Posted 1 week ago
15.0 - 20.0 years
45 - 50 Lacs
Chennai
Work from Office
Role: Team Lead - Enterprise Architecture and Operations

The role requires you to be a Domain Expert/Business and Process Consultant with architecture know-how, possessing in-depth knowledge of the domain, business objectives, and the processes that support those objectives. This role involves collaborating with enterprise teams to ensure alignment with these objectives. The role requires you to be a strong and impactful communicator, capable of addressing user requirements or concerns either directly or through group calls and community forums. It is essential that stakeholder requirements are adequately addressed during the technical application maintenance and operational processes. He/she must be able to independently plan and organize end-user training sessions and motivate users towards compliance.

Unit Overview: IT Strategy, Governance & Security
Location: Chennai
Experience: 15+ years
Number of openings: 1

What awaits you / Job Profile:
- Oversee the day-to-day operations of the applications landscape, ensuring their smooth functioning. Primary upward interaction is with the direct supervisor.
- Demonstrate knowledge and understanding of the Enterprise Architecture.
- Empathize with end users to understand the problem statement and resolve conflicts.
- Communicate with stakeholders (business users, management, etc.) regarding application status, performance, and any issues or changes.
- Proficiency in using agile project management tools and software, such as JIRA, Confluence, and SharePoint.
- Business application operation: end-user support, end-user tickets, analysis and resolution of tool and data issues, and creation of input for necessary application maintenance activities.
- Maintain and develop documentation.
- Create and publish reports periodically.
- Create and demonstrate meaningful and understandable data for end users to drive them towards compliance.

What should you bring along:
- Minimum 15 years of experience in IT, with a minimum of 6 years of experience in Enterprise Architecture methodology (TOGAF certification is good to have) and EAM tools (e.g. Alfabet).
- Experience in SQL, databases, and Windows administration.
- Familiarity with DevOps practices and tools.
- Proven experience in application operation, user management, and incident resolution.
- ITIL certification with expertise in ITSM service management (Incident, Change, and Problem).
- Expertise in data analytics, data extraction, ETL, database research, and reporting.
- Look out for continuous service improvement initiatives.
- Excellent communication skills, preferably with prior international experience or working directly with international clients.
- Experience working with agile ecosystem principles, timeboxing, roles, and ceremonies.
- Proficiency in using agile project management tools and software, such as JIRA, Confluence, and SharePoint.
- Strong experience working in a continuous-integration-driven environment, with a proven track record of progressive thinking and continuous improvement.
- Work collaboratively with the Product Owner, Enterprise Architecture Management, business, and other stakeholders, including process partners.
- Impactful communicator with a strong willingness to provide the best end-user experience.
- Motivated to drive the organization towards compliance as driven by the IT Governance rules and instructions.
- Well versed in stakeholder management.
- Can independently plan and organize trainings, community sessions, and workshops.

Must-have technical skills: Enterprise Architecture, IT Service Management, Data Analytics, Stakeholder Management, People Management, Leadership Skills, Impactful Communication

Good-to-have technical skills: TOGAF Certification, CSM or similar certification, prior Product Management/Project Management experience
Posted 1 week ago
5.0 - 7.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Experience: 5 to 7 Years
Work Location: Hyderabad

As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team!

What You'll Tackle Each Day:
- End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support.
- Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau.
- SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis. PySpark experience is an added advantage.
- Independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities.
- Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions.
- Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense.

Qualifications: With 5 to 7 years of experience in Business Intelligence, focusing on Tableau development and SQL, you consistently deliver impactful BI solutions. A strong understanding of data visualization best practices is expected.
Posted 1 week ago