8.0 years
20 - 28 Lacs
Gurgaon
On-site
Job Title: Tableau Developer
Location: Gurgaon (Work From Office)
Job Type: Full Time Role
Experience Level: 8-12 Years

Job Summary:
We are seeking a talented Tableau Developer to join our Business Intelligence and Analytics team. The ideal candidate will be responsible for designing, developing, and maintaining visually compelling and insightful dashboards and reports using Tableau. You will work closely with business stakeholders to understand requirements, translate data into actionable insights, and support data-driven decision-making.

Key Responsibilities:
- Design and develop interactive Tableau dashboards, visualizations, and reports based on business needs.
- Collaborate with business analysts, data engineers, and stakeholders to gather requirements and define KPIs.
- Optimize dashboard performance and usability.
- Write complex SQL queries to extract and transform data from various sources (e.g., SQL Server, Oracle, Snowflake).
- Conduct data validation and ensure data quality and accuracy.
- Schedule and publish dashboards to Tableau Server / Tableau Online for end-user access.
- Provide training, documentation, and support to business users.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Statistics, or a related field.
- 8-12 years of hands-on experience with Tableau Desktop and Tableau Server.
- Proficiency in SQL for data manipulation and analysis.
- Strong understanding of data warehousing concepts and relational databases.
- Ability to analyze large datasets and turn them into meaningful visual insights.
- Experience with data blending, LOD (Level of Detail) expressions, filters, parameters, and calculated fields in Tableau.

Preferred Qualifications:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of ETL tools (e.g., Alteryx, Talend, Informatica) or scripting languages (Python, R).
- Understanding of data governance and security principles.
- Tableau certification (Desktop Specialist, Certified Associate, etc.) is a plus.
- Exposure to Agile methodologies.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,800,000.00 per year
Work Location: In person
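As a small illustration of the SQL extraction-and-transformation work this role describes, the sketch below runs a grouped aggregation of the kind a Tableau dashboard might consume, using Python's built-in sqlite3; the table and column names are hypothetical stand-ins for a real warehouse source.

```python
import sqlite3

# In-memory database with a hypothetical sales table standing in for a warehouse source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL, sale_date TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 100.0, "2024-01-05"),
     ("North", 250.0, "2024-02-10"),
     ("South", 300.0, "2024-01-20")],
)

# Aggregate per region: the kind of pre-shaped extract a dashboard would visualize.
rows = conn.execute(
    """
    SELECT region,
           COUNT(*)    AS orders,
           SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC
    """
).fetchall()

for region, orders, total in rows:
    print(region, orders, total)
```

In practice the same query shape would run against SQL Server, Oracle, or Snowflake, and the result would be published as a Tableau extract rather than printed.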
Posted 2 days ago
6.0 years
0 - 0 Lacs
Gurgaon
On-site
Job Description:
We are seeking a highly skilled and experienced Senior BI Developer / SQL Developer to join our team. The ideal candidate will have strong proficiency in SQL, hands-on experience with BI tools, and a deep understanding of data modeling, ETL processes, and data warehousing concepts. You will work closely with cross-functional teams to design, develop, and maintain robust reporting and analytics solutions that support key business decisions.

Key Responsibilities:
- Develop, maintain, and optimize complex SQL queries, stored procedures, and scripts across RDBMS such as MySQL or PostgreSQL.
- Design and build interactive dashboards and reports using BI tools such as Dundas BI, Power BI, Tableau, or Cognos.
- Translate business requirements into technical solutions using data modeling and database design best practices.
- Implement and support ETL processes to integrate data from various sources into data warehouses.
- Monitor and tune database performance, ensuring high availability and efficiency.
- Collaborate with business analysts, data engineers, and stakeholders to deliver high-quality, data-driven insights.
- Work in Agile/Scrum teams, actively participating in sprints, stand-ups, and retrospectives.
- Assist in migrating data and reporting solutions to cloud platforms like Azure or AWS.
- Provide documentation, training, and support to end-users on report usage and self-service BI tools.
- Ensure data integrity, security, and governance across reporting systems.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 6+ years of experience as a Report Writer, BI Developer, or SQL Developer.
- Advanced proficiency in SQL and experience with MySQL, PostgreSQL, or similar RDBMS.
- Proven experience with BI/reporting tools like Dundas BI, Power BI, Tableau, or Cognos.
- Strong understanding of data modeling, relational database design, and data warehousing concepts.
- Familiarity with ETL tools and performance tuning of large datasets.
- Exposure to cloud environments such as Microsoft Azure or AWS is a plus.
- Excellent problem-solving and analytical skills with attention to detail.

FOR IMMEDIATE RESPONSE SEND YOUR UPDATED CV TO: amrit@qapsoftware.com

Job Type: Full-time
Pay: ₹80,000.00 - ₹91,000.00 per month
Application Question(s):
- How many years of experience do you have in IT?
- How many years of experience do you have in RDBMS?
- How many years of experience do you have in Data Modeling and Data Warehousing?
- How many years of experience do you have in BI tools?
Work Location: In person
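The performance-tuning duty listed above often comes down to indexing frequently filtered columns. A minimal sketch with Python's sqlite3, using EXPLAIN QUERY PLAN to show the planner switching from a full table scan to an index lookup (the table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 50, float(i)) for i in range(1000)],
)

# Before indexing, a lookup by customer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()

# Adding an index lets the planner seek directly to the matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()

print(plan_before[0][3])  # a SCAN step
print(plan_after[0][3])   # a SEARCH step using idx_orders_customer
```

The same workflow, inspecting the plan before and after adding an index, applies in MySQL and PostgreSQL via their own EXPLAIN statements.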
Posted 2 days ago
3.0 - 5.0 years
8 - 16 Lacs
Bengaluru
Hybrid
OVERVIEW
The Data Engineer will work closely with clients and the eCS Biometrics team to optimize the elluminate platform for end-to-end solutions to aggregate, transform, access, and report on clinical data throughout the life cycle of a clinical trial. This includes study design in elluminate, collaboration on specifications, and configuration of the various modules, including Data Central, Clinical Data Analytics and Trial Operational Analytics, Risk-Based Quality Management (RBQM), Statistical Computing Environment (SCE), and Operational Insights. The Data Engineer will be involved in standard ETL activities as well as programming custom listings, visualizations, and analytics tools using Mapper and Qlik. The position involves a high level of quality control as well as adherence to standard operating procedures and work instructions, and a constant drive towards automation and process improvement.

KEY TASKS & RESPONSIBILITIES
- Design, develop, test, and deploy highly efficient code supporting SDTM, custom reports, and visualizations using tools like MS SQL, elluminate® Mapper, and Qlik
- Configure ETL processes to support the aggregation and standardization of clinical data from various sources, including EDC systems, SAS, and central laboratory vendors
- Work with Analytics developers, other team members, and clients to review business requirements and translate them into database objects and visualizations
- Manage multiple timelines and deliverables (for single or multiple clients) and manage client communications as assigned
- Provide diagnostic support and fix defects as needed
- Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures
- Other duties as assigned

CANDIDATE'S PROFILE

Education & Experience
- 3+ years of professional experience preferred
- Bachelor's degree or equivalent experience preferred
- Experience with database/warehouse architecture, design, and development preferred
- Knowledge of various data platforms and warehouses, including SQL Server, DB2, Teradata, AWS, Azure, Snowflake, etc.
- Understanding of Cloud / Hybrid data architecture concepts is a plus
- Knowledge of clinical trial data is a plus - CDISC ODM, SDTM, or ADaM standards
- Experience in the Pharmaceutical/Biotechnology/Life Science industry is a plus

Professional Skills
- Critical thinking, problem solving, and strong initiative
- Communication and task management skills while working with technical and non-technical teams (both internal to eCS and clients)
- Must be team oriented with strong collaboration, prioritization, and adaptability skills
- Excellent knowledge of English; verbal and written communication skills with the ability to interact with users and clients providing solutions
- Excited to learn new tools and product modules and adapt to changing technology and requirements
- Experience in the Life Sciences industry, CRO / Clinical Trial regulated environment preferred

Technical Skills
- Proficient in SQL, T-SQL, PL/SQL programming
- Experience in Microsoft Office Applications, specifically MS Project and MS Excel
- Familiarity with multiple database platforms: Oracle, SQL Server, Teradata, DB2
- Familiarity with data reporting tools: QlikSense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark, or related
- Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, Data Pipelines, Data Modelling, Data Analytics, BI, Data Warehouse, Data Lake, or related
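The standardization step mentioned above, aggregating clinical data from multiple vendors into a common layout, can be sketched as a column-mapping transform. This toy example maps invented vendor-specific field names onto SDTM-style lab variables (USUBJID, LBTESTCD, LBORRES, LBORRESU); the vendor layouts are hypothetical, and real pipelines would be built in elluminate Mapper or SQL rather than plain Python.

```python
# Hypothetical raw lab records from two vendors, with differing column names.
vendor_a = [{"subj": "001", "test": "GLUC", "value": "5.4", "unit": "mmol/L"}]
vendor_b = [{"patient_id": "002", "lab_test": "GLUC", "result": "5.9", "units": "mmol/L"}]

def to_standard(record, mapping):
    """Rename source fields to a common (SDTM-like) layout via a mapping dict."""
    return {target: record[source] for target, source in mapping.items()}

# One mapping spec per vendor: standard variable -> vendor column name.
MAP_A = {"USUBJID": "subj", "LBTESTCD": "test", "LBORRES": "value", "LBORRESU": "unit"}
MAP_B = {"USUBJID": "patient_id", "LBTESTCD": "lab_test", "LBORRES": "result", "LBORRESU": "units"}

standardized = ([to_standard(r, MAP_A) for r in vendor_a]
                + [to_standard(r, MAP_B) for r in vendor_b])
print(standardized)
```

Keeping the mapping as data (one dict per source) rather than per-vendor code is what makes this kind of aggregation maintainable as new EDC or lab vendors are onboarded.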
Posted 2 days ago
4.0 years
0 - 0 Lacs
Mohali
On-site
Job Description:
- Should have 4+ years of hands-on experience in algorithms and implementation of analytics solutions in predictive analytics, text analytics, and image analytics.
- Should have hands-on experience leading a team of data scientists; works closely with the client's technical team to plan, develop, and execute on client requirements, providing technical expertise and project leadership.
- Leads efforts to foster innovative ideas for developing high-impact solutions.
- Evaluates and leads a broad range of forward-looking analytics initiatives, tracks emerging data science trends, and drives knowledge sharing.
- Engages key stakeholders to source, mine, and validate data and findings and to confirm business logic and assumptions in order to draw conclusions.
- Helps design and develop advanced analytic solutions across functional areas as per requirements/opportunities.

Technical Role and Responsibilities:
- Demonstrated strong capability in statistical/mathematical modelling, Machine Learning, or Artificial Intelligence.
- Demonstrated skills in programming for implementation and deployment of algorithms, preferably in statistical/ML-oriented languages such as Python.
- Sound experience with traditional as well as modern statistical techniques, including Regression, Support Vector Machines, Regularization, Boosting, Random Forests, and other Ensemble Methods.
- Visualization tool experience, preferably with Tableau or Power BI.
- Sound knowledge of ETL practices, preferably Spark in Databricks, and cloud big data technologies like AWS, Google, Microsoft, or Cloudera.
- Communicate complex quantitative analysis as lucid, precise, and actionable insights.
- Develop new practices and methodologies using statistical methods, machine learning, and predictive models under mentorship.
- Carry out statistical and mathematical modelling, solving complex business problems and delivering innovative solutions using state-of-the-art tools and cutting-edge technologies for big data and beyond.
- Preferred to have a Bachelor's/Master's in Statistics/Machine Learning/Data Science/Analytics.
- Should be a Data Science Professional with a knack for solving problems using cutting-edge ML/DL techniques and implementing solutions leveraging cloud-based infrastructure.
- Should be strong in GCP, TensorFlow, NumPy, Pandas, Python, AutoML, BigQuery, Machine Learning, Artificial Intelligence, Deep Learning.

Preferred Tech Skills: Python, Computer Vision, Machine Learning, RNN, Data Visualization, Natural Language Processing, Voice Modulation, Speech to Text, SpaCy, LSTM, Object Detection, Scikit-learn, NumPy, NLTK, Matplotlib, Cufflinks, Seaborn, Image Processing, Neural Networks, YOLO, DarkFlow, DarkNet, PyTorch, CNN, TensorFlow, Keras, U-Net, Image Segmentation, MobileNet, OCR, OpenCV, Pandas, Scrapy, BeautifulSoup, LabelImg, Git.

Core Areas: Machine Learning, Deep Learning, Computer Vision, Natural Language Processing, Statistics
Programming Languages: Python
Libraries & Software Packages: TensorFlow, Keras, OpenCV, Pillow, Scikit-learn, Flask, NumPy, Pandas, Matplotlib, Docker
Cloud Services: Compute Engine, GCP AI Platform, Cloud Storage, GCP AI & ML APIs

Job Types: Full-time, Permanent, Fresher
Pay: ₹30,000.00 - ₹80,000.00 per month
Education: Bachelor's (Preferred)
Experience: Machine learning: 4 years (Preferred)
Work Location: In person
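Of the statistical techniques the posting lists, ordinary least-squares regression is the simplest to demonstrate. The stdlib-only toy below fits a univariate line from first principles; real project work would use scikit-learn or statsmodels, and the data here is invented.

```python
# Ordinary least squares for y = slope*x + intercept, computed from first principles.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.1, 4.9, 7.2, 8.8, 11.0]  # roughly y = 2x + 1 with small noise

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Slope is covariance(x, y) / variance(x); intercept anchors the line at the means.
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x

print(round(slope, 3), round(intercept, 3))  # close to 2 and 1
```

The same closed-form estimate is what `LinearRegression` in scikit-learn computes for the single-feature case, just generalized to many features via linear algebra.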
Posted 2 days ago
3.0 years
0 Lacs
Delhi
Remote
Apache Superset Data Engineer
Experience: 3 - 6 years
Location: Bhubaneswar, Delhi - NCR, Remote Working

About the Job
The Apache Superset Data Engineer plays a key role in designing, developing, and maintaining scalable data pipelines and analytics infrastructure, with a primary emphasis on data visualization and dashboarding using Apache Superset. This role sits at the intersection of data engineering and business intelligence, enabling stakeholders to access accurate, actionable insights through intuitive dashboards and reports.

Core Responsibilities
- Create, customize, and maintain interactive dashboards in Apache Superset to support KPIs, experimentation, and business insights
- Work closely with analysts, BI teams, and business users to gather requirements and deliver effective Superset-based visualizations
- Perform data validation, feature engineering, and exploratory data analysis to ensure data accuracy and integrity
- Analyze A/B test results and deliver insights that inform business strategies
- Establish and maintain standards for statistical testing, data validation, and analytical workflows
- Integrate Superset with various database systems (e.g., MySQL, PostgreSQL) and manage associated drivers and connections
- Ensure Superset deployments are secure, scalable, and high-performing
- Clearly communicate findings and recommendations to both technical and non-technical stakeholders

Required Skills
- Proven expertise in building dashboards and visualizations using Apache Superset
- Strong command of SQL and experience working with relational databases like MySQL or PostgreSQL
- Proficiency in Python (or Java) for data manipulation and workflow automation
- Solid understanding of data modelling, ETL/ELT pipelines, and data warehousing principles
- Excellent problem-solving skills and a keen eye for data quality and detail
- Strong communication skills, with the ability to simplify complex technical concepts for non-technical audiences
- Nice to have: familiarity with cloud platforms (AWS, ECS)

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of relevant experience
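The A/B-test analysis responsibility above typically starts with a two-proportion z-test on conversion rates. A stdlib-only sketch follows; the experiment counts are invented for illustration.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Z statistic for the difference between two conversion rates (pooled variance)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical experiment: variant A converts 120/1000 users, variant B 150/1000.
z = two_proportion_ztest(120, 1000, 150, 1000)
print(round(z, 3))  # around -1.96, i.e. borderline at the usual 5% significance level
```

In a Superset workflow, the success and trial counts would come from a SQL aggregation over the experiment-assignment table, and the resulting significance call would be surfaced on the dashboard alongside the raw rates.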
Posted 2 days ago
0 years
0 Lacs
Delhi
On-site
Job requisition ID: 84391
Date: Jun 16, 2025
Location: Delhi
Designation: Consultant
Entity: ETL
Posted 2 days ago
4.0 - 6.0 years
0 Lacs
Bhubaneshwar
On-site
Position: Data Migration Engineer (NV46FCT RM 3324)

Required Qualifications:
- 4-6 years of experience in data migration, data integration, and ETL development.
- Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Experience in Google BigQuery for data ingestion, transformation, and performance optimization.
- Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic.
- Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue.
- Experience working in cloud environments such as AWS, GCP, or Azure.
- Solid understanding of data modeling, schema design, and transformation best practices.

Preferred Qualifications:
- Experience in BigQuery optimization, federated queries, and integration with external data sources.
- Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery.
- Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow.
- Familiarity with workflow orchestration tools such as Apache Airflow or dbt.
- Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios.

Soft Skills:
- Strong problem-solving and analytical mindset with high attention to data quality.
- Excellent communication and collaboration skills to work with engineering and client teams.
- Ability to handle complex migrations under tight deadlines with minimal supervision.

Job Category: Digital_Cloud_Web Technologies
Job Type: Full Time
Job Location: Bhubaneshwar, Noida
Experience: 4-6 years
Notice period: 0-30 days
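The "custom ETL logic" this role calls for often amounts to small, testable record transforms between extract and load. A stdlib-only sketch using the csv module; the source layout and cleanup rules are hypothetical, and a real migration would read from the legacy database and write to BigQuery.

```python
import csv
import io

# Hypothetical source extract; note the stray whitespace typical of legacy exports.
raw = io.StringIO("id,name,signup\n1, Alice ,2024-01-05\n2,Bob,2024-02-10\n")

def transform(row):
    """Trim whitespace, cast types, and split the date into year/month columns."""
    year, month, _ = row["signup"].split("-")
    return {
        "id": int(row["id"]),
        "name": row["name"].strip(),
        "signup_year": int(year),
        "signup_month": int(month),
    }

cleaned = [transform(r) for r in csv.DictReader(raw)]
print(cleaned)
```

Keeping the per-row transform as a pure function makes it easy to unit-test the migration logic before pointing it at production data, which matters when deadlines are tight.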
Posted 2 days ago
5.0 years
0 Lacs
Orissa
Remote
No. of Positions: 1
Position: Lead Data Engineer
Location: Hybrid or Remote
Total Years of Experience: 5+ years

Key Responsibilities:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms like Azure, Salesforce, and AWS technologies.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
- Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations.
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
- Hands-on development and implementation of large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
- This job has no supervisory responsibilities.
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work.
- 5+ years' experience with strong SQL query/development skills.
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Experience working in the healthcare industry with PHI/PII.
- Creative, lateral, and critical thinker.
- Excellent communicator.
- Well-developed interpersonal skills.
- Good at prioritizing tasks and time management.
- Ability to describe, create, and implement new solutions.
- Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Knowledge / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
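The "checks and balances across all jobs" responsibility above can be sketched as a small post-load quality gate. The rows, field names, and check set below are hypothetical; in a dbt-based stack these same checks would normally be declared as dbt tests (not_null, row counts) rather than hand-rolled.

```python
# Hypothetical batch of loaded rows; a real pipeline would pull these from the warehouse.
rows = [
    {"id": 1, "email": "a@example.com", "amount": 10.0},
    {"id": 2, "email": None,            "amount": 12.5},
]

def run_quality_checks(batch, expected_min_rows=1):
    """Return a list of (check_name, passed) pairs for a loaded batch."""
    results = []
    # Completeness: did the load deliver at least the expected volume?
    results.append(("row_count", len(batch) >= expected_min_rows))
    # Validity: key fields must never be null.
    results.append(("no_null_ids", all(r["id"] is not None for r in batch)))
    results.append(("no_null_emails", all(r["email"] is not None for r in batch)))
    return results

report = run_quality_checks(rows)
print(report)  # the null email in row 2 fails the last check
```

A failing pair would typically page the on-call engineer or block downstream jobs, which is the "balance" half of checks and balances.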
Posted 2 days ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics - GIG - Data Modeller

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We're looking for a candidate with 3-7 years of expertise in data science, data analysis, and visualization, to act as Technical Lead to a larger team in the EY GDS DnA team working on various Data and Analytics projects.

Your Key Responsibilities
- Lead and mentor a team throughout the design, development, and delivery phases, and keep the team intact in high-pressure situations.
- Work as a senior team member contributing to various technical streams of EY DnA implementation projects.
- Remain client focused, with good presentation, communication, and relationship building skills.
- Complete assigned tasks on time and provide regular status reporting to the lead.
- Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques; validate the model results and articulate the insights to the business team.
- Interface and communicate with the onsite teams directly to understand requirements and determine optimum solutions.
- Create technical solutions as per business needs by translating their requirements and finding innovative solution options.
- Provide product- and design-level functional and technical expertise along with best practices.
- Get involved in business development activities like creating proof of concepts (POCs) and points of view (POVs), assist in proposal writing and service offering development, and develop creative PowerPoint content for presentations.
- Participate in organization-level initiatives and operational activities.
- Ensure continual knowledge management and contribute to internal L&D teams.
- Build a quality work culture, foster teamwork, and lead by example.

Skills and attributes for success
- Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
- Strong communication, presentation, and team building skills, and experience in producing high-quality reports, papers, and presentations.
- Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

To qualify for the role, you must have
- BE/BTech/MCA/MBA with 3+ years of industry experience with machine learning, visualization, data science, and related offerings.
- At least 3+ years of experience in BI and Analytics.
- Ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture, and data modelling.
- Knowledge and experience of at least one Insurance domain engagement, Life or Property & Casualty.
- Understanding of Business Intelligence, Data Warehousing, and Data Modelling.
- Good experience using CA Erwin or another similar modelling tool is an absolute must.
- Experience working with Guidewire DataHub & InfoCenter.
- Strong knowledge of relational and dimensional data modelling concepts.
- Ability to develop logical and physical data flow models for ETL applications.
- Ability to translate data access, transformation, and movement requirements into functional requirements and mapping designs.
- Strong knowledge of data architecture, database structure, data analysis, and SQL skills.
- Experience in data management analysis: analyse business objectives and evaluate data solutions to meet customer needs.
- Establish scalable, efficient, automated processes for large-scale data analyses and management.
- Prepare and analyse historical data and identify patterns.
- Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques; validate the model results and articulate the insights to the business team.
- Drive business requirements gathering for analytics projects.
- Intellectual curiosity and eagerness to learn new things.
- Experience with unstructured data is an added advantage.
- Ability to effectively visualize and communicate analysis results.
- Experience with big data and cloud preferred.
- Experience, interest, and adaptability to working in an Agile delivery environment.
- Ability to work in a fast-paced environment where change is a constant, and ability to handle ambiguous requirements.
- Exceptional interpersonal and communication skills (written and verbal).

Ideally, you'll also have
- Good exposure to any ETL tools.
- Good to have knowledge about P&C insurance.
- Understanding of Business Intelligence, Data Warehousing, and Data Modelling.
- Must have led a team of at least 4 members.
- Experience in the Insurance and Banking domains.
- Prior client facing skills; self-motivated and collaborative.

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 days ago
0 years
0 Lacs
Noida
On-site
Posted On: 16 Jun 2025
Location: Noida, UP, India
Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It's happening right here at Iris Software.

About Iris Software
At Iris Software, our vision is to be our client's most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working at Iris
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about "Being Your Best" as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.
Job Description
- ETL, Shell / Python Scripting
- Hadoop, Cloudera, Data Lake
- Golden Source or Markit EDM
- Database Expertise
- DevOps CI/CD Experience

Mandatory Competencies
- DevOps - Shell Scripting
- Python - Python
- ETL - Azure Data Factory
- Data on Cloud - Azure Data Lake (ADL)
- DevOps - CI/CD
- Database - PL SQL
- Database - Oracle
- Database - SQL
- Beh - Communication and collaboration

Perks and Benefits for Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
Posted 2 days ago
7.0 years
6 - 7 Lacs
Noida
On-site
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry. Job Description: In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights. 
Description
We are seeking a highly skilled Lead Data Analyst to join our Analytics team to serve customers across the property insurance and restoration industries. As a Lead Data Analyst you will play a crucial role in developing methods and models to inform data-driven decision processes, resulting in improved business performance for both internal and external stakeholder groups. You will be responsible for interpreting complex data sets and providing valuable insights to enhance the value of data assets. The successful candidate will have a strong understanding of data mining techniques, methods of statistical analysis, and data visualization tools. This position offers an exciting opportunity to work in a dynamic environment, collaborating with cross-functional teams to support decision processes that will guide the respective industries into the future.
Responsibilities Collaborate with cross-functional teams to understand and document requirements for analytics products. Serve as the primary point of contact for new data/analytics requests and support for customers. Lead a team of analysts to complete client deliverables in a timely manner. Act as the domain expert and voice of the customer to internal stakeholders during the analytics development process. Develop and maintain an inventory of data, reporting, and analytic product deliverables for assigned customers. Work with customer success teams to establish and maintain appropriate customer expectations for analytics deliverables. Create and manage tickets on behalf of customers within internal frameworks. Ensure timely delivery of assets to customers and aid in the development of internal processes for the delivery of analytics deliverables. Work with IT/Infrastructure teams to provide customer access to assets and support internal audit processes to ensure data security. Create and optimize complex SQL queries for data extraction, transformation, and aggregation. Develop and maintain data models, dashboards, and reports to visualize data and track key performance metrics. Conduct validation checks and implement error handling mechanisms to ensure data reliability. Collaborate closely with stakeholders to align project goals with business needs and perform ad-hoc analysis to provide actionable recommendations. Analyze large and complex datasets to identify trends, patterns, and insights, and present findings and recommendations to stakeholders in a clear and concise manner. Job Qualifications: 7+ years’ property insurance experience preferred. 5+ years’ experience in management of mid-level professional teams or a similar leadership position with a focus on data and/or performance management. Extensive experience in applying and/or developing performance management metrics for claims organizations.
Bachelor’s degree in computer science, data science, statistics, or a related field is preferred. Mastery-level knowledge of data analysis tools such as Excel, Tableau, or Power BI. Demonstrated expertise in creating Power BI reports and dashboards, including the ability to connect to various data sources, prepare and model data, and create visualizations. Expert knowledge of DAX for creating calculated columns and measures to meet report-specific requirements. Expert knowledge of Power Query for importing, transforming, and shaping data. Proficiency in SQL with the ability to write complex queries and optimize performance. Strong knowledge of ETL processes, data pipelines, and automation is a plus. Proficiency in managing tasks with Jira is an advantage. Strong analytical and problem-solving skills. Excellent attention to detail and the ability to work with large datasets. Effective communication skills, both written and verbal. Excellent visual-communication and data-storytelling skills. Ability to work independently and collaborate in a team environment. Cotality's Diversity Commitment: Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences. Equal Opportunity Employer Statement: Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.
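The SQL extraction, aggregation, and validation-check duties described in this posting can be sketched in a few lines. This is an illustrative stand-in only: the table and column names are hypothetical, and sqlite3 stands in for a production database such as SQL Server or Snowflake.

```python
import sqlite3

# Hypothetical claims table; names are illustrative, not from any real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "North", 1200.0), (2, "North", 800.0), (3, "South", 500.0)],
)

# Aggregation of the kind a dashboard KPI would be built on.
rows = conn.execute(
    "SELECT region, COUNT(*) AS n, SUM(amount) AS total "
    "FROM claims GROUP BY region ORDER BY region"
).fetchall()

# Simple validation check: aggregated totals must reconcile with the raw table.
grand_total = conn.execute("SELECT SUM(amount) FROM claims").fetchone()[0]
assert abs(sum(total for _, _, total in rows) - grand_total) < 1e-9
```

In practice the validation step would compare row counts and totals between source and target after each load, flagging discrepancies before a dashboard publishes them.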
Privacy Policy Global Applicant Privacy Policy By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBING and will automatically be opted out company-wide.
Posted 2 days ago
3.0 years
0 Lacs
India
Remote
Ready to be pushed beyond what you think you’re capable of? At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system. To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems. Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be. While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported. Team Coinbase is seeking a software engineer to join our India pod to drive the launch and growth of Coinbase in India. You will solve unique, large-scale, highly complex technical problems. You will help build the next generation of systems to make cryptocurrency accessible to everyone across multiple platforms (web, iOS, Android), operating real-time applications with high frequency and low latency updates, keeping the platform safe from fraud, enabling delightful experiences, and managing the most secure, containerized infrastructure running in the cloud. 
What you’ll be doing (i.e., job duties): Build high-performance services using Golang and gRPC, creating seamless integrations that elevate Coinbase's customer experience. Adopt, learn, and drive best practices in design techniques, coding, testing, documentation, monitoring, and alerting. Demonstrate a keen awareness of Coinbase’s platform, development practices, and various technical domains, and build upon them to efficiently deliver improvements across multiple teams. Add positive energy in every meeting and make your coworkers feel included in every interaction. Communicate across the company to both technical and non-technical leaders with ease. Deliver top-quality services in a tight timeframe by navigating seamlessly through uncertainties. Work with teams and teammates across multiple time zones. What we look for in you (i.e., job requirements): 3+ years of experience as a software engineer and 1+ years building backend services using Golang and gRPC. A self-starter capable of executing complex solutions with minimal guidance while ensuring efficiency and scalability. Proven experience integrating at least two third-party applications using Golang. Hands-on experience with AWS, Kubernetes, Terraform, Buildkite, or similar cloud infrastructure tools. Working knowledge of event-driven architectures (Kafka, MQ, etc.) and hands-on experience with SQL or NoSQL databases. Good understanding of gRPC, GraphQL, ETL pipelines, and modern development practices. Nice to haves: SaaS platform experience (Salesforce, Amazon Connect, Sprinklr). Experience with AWS, Kubernetes, Terraform, GitHub Actions, or similar tools. Familiarity with rate limiters, caching, metrics, logging, and debugging. Req ID - GCBE04IN Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying. 
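The nice-to-haves above mention familiarity with rate limiters. As a rough illustration of the concept only (not Coinbase's implementation, and in Python rather than the Golang the role uses), here is a minimal token-bucket limiter that allows a small burst and then throttles:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: refills `rate` tokens/sec up to `capacity`."""
    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)  # start full, so a burst is allowed
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=1.0, capacity=2)
results = [bucket.allow() for _ in range(3)]  # burst of 2 allowed, third throttled
```

A production service would typically keep the bucket state in a shared store (e.g. Redis) keyed per client, rather than in process memory.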
Commitment to Equal Opportunity Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here). Global Data Privacy Notice for Job Candidates and Applicants Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here. By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
Posted 2 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Summary: We are seeking a highly skilled and experienced Data Scientist with a deep understanding of data analytics powered by artificial intelligence (AI) tools. The ideal candidate will be passionate about turning data into actionable insights using cutting-edge AI platforms, automation techniques, and advanced statistical methods. Key Responsibilities: Develop and deploy scalable AI-powered data analytics solutions for business intelligence, forecasting, and optimization. Leverage AI tools to automate data cleansing, feature engineering, model building, and visualization. Design and conduct advanced statistical analyses and machine learning models (supervised, unsupervised, NLP, etc.). Collaborate cross-functionally with engineering and business teams to drive data-first decision-making. Must-Have Skills & Qualifications: Minimum 4 years of professional experience in data science, analytics, or a related field. Proficiency in Python and/or R with strong hands-on experience in ML libraries (scikit-learn, XGBoost, TensorFlow, etc.). Expert knowledge of SQL and working with relational databases. Proven experience with data wrangling, data pipelines, and ETL processes. Deep Understanding of AI Tools for Data Analytics (Experience with several of the following is required): Data Preparation & Automation: Alteryx, Trifacta, KNIME AI/ML Platforms: DataRobot, H2O.ai, Amazon SageMaker, Azure ML Studio, Google Vertex AI Visualization & BI: Tableau, Power BI, Looker (with AI/ML integrations) AutoML & Predictive Modeling: Google AutoML, IBM Watson Studio, BigML NLP & Text Analytics: OpenAI (ChatGPT, Codex APIs), Hugging Face Transformers, MonkeyLearn Workflow Orchestration: Apache Airflow, Prefect Preferred Qualifications: Degree in Computer Science, Data Science, Statistics, or related field. Experience in cloud-based environments (AWS, GCP, Azure) for ML workloads. 
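The data cleansing and feature engineering steps that the platforms listed above automate can be shown with a stdlib-only sketch. The field names are hypothetical, and in real work a library such as pandas or scikit-learn would replace the hand-rolled steps:

```python
# Toy raw records with a missing value; field names are illustrative only.
raw = [
    {"age": "34", "city": "Bengaluru"},
    {"age": None, "city": "Pune"},       # missing value to impute
    {"age": "29", "city": "Bengaluru"},
]

# Cleanse: cast ages to numbers, impute missing values with the observed mean.
ages = [int(r["age"]) for r in raw if r["age"] is not None]
mean_age = sum(ages) / len(ages)
cleaned = [
    {**r, "age": int(r["age"]) if r["age"] is not None else mean_age}
    for r in raw
]

# Feature-engineer: one-hot encode the categorical `city` column.
cities = sorted({r["city"] for r in cleaned})
features = [
    [r["age"]] + [1 if r["city"] == c else 0 for c in cities]
    for r in cleaned
]
```

Each output row is now purely numeric (age plus one indicator per city), which is the form most ML models listed in the posting expect as input.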
To apply, please send your resume to sooraj@superpe.in or shreya@superpe.in. SuperPe is an equal opportunity employer and welcomes candidates of all backgrounds to apply. We look forward to hearing from you!
Posted 2 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Summary As a Java Developer for the Data and Analytics team, you will work within a Professional Services team to support our customer’s data migrations from legacy systems to Guidewire Cloud. You will also support the development of new tooling and methodology to streamline our migration process. Job Description You will work with our customers, partners, and other Guidewire team members to deliver successful migration programs utilizing our custom migration tools. You will utilize best practices for design, development and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics teams. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly-skilled technical problem solvers who can bring their array of experience working in previous Migration roles. You will cooperate closely with teams located around the world. Key Responsibilities You will deliver data migration projects for our customers accurately and on time You will work with the broader Guidewire data team to improve our internal processes and methodology You will participate in the creation of new tooling to support and streamline our data migration projects when called upon or when the opportunity presents itself You are a systematic problem-solver who takes ownership of your projects and does not shy away from the hard problems. You are driven to success and accept nothing less from yourself. You consistently display the ability to work independently in a fast-paced Agile environment. Flexibility to do shift work as needed (aligning to AMER/APAC colleagues/customers). 
Qualifications Bachelor’s or Master’s Degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a delivery-type role. Development experience using Java (or another object-oriented language) preferred. Experience developing and deploying production REST APIs. Familiarity with data processing and ETL (Extract, Transform, Load) concepts. Experience working with relational and/or NoSQL databases. Proficiency in SQL, data modeling, ETL/ELT, and cloud computing. Experience working with customer teams to understand business objectives and functional requirements. Effective leadership, interpersonal, and communication skills. Ability to work independently and within a team. Nice To Have Insurance industry experience Experience with the Guidewire InsuranceSuite Guidewire ACE Certification Experience in Data Migration About Guidewire Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record with 1600+ successful projects, supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC. Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace, and believe that a diversity of perspectives, abilities, and cultures is a key to our success.
Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.
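The legacy-to-cloud data migration and ETL concepts this role centers on can be illustrated with a toy extract-transform-load pass. This is a sketch only: sqlite3 stands in for both the legacy system and the cloud target, and the schema is hypothetical, not Guidewire's.

```python
import sqlite3

# Toy "legacy" source and cloud-bound target, both in-memory; table and
# column names are hypothetical.
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE policies (policy_no TEXT, premium_cents INTEGER)")
legacy.executemany("INSERT INTO policies VALUES (?, ?)",
                   [("P-001", 125000), ("P-002", 99900)])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE policies (policy_no TEXT, premium REAL)")

# Extract -> transform (cents to currency units) -> load.
for policy_no, cents in legacy.execute(
        "SELECT policy_no, premium_cents FROM policies"):
    target.execute("INSERT INTO policies VALUES (?, ?)",
                   (policy_no, cents / 100.0))
target.commit()

migrated = target.execute(
    "SELECT policy_no, premium FROM policies ORDER BY policy_no").fetchall()
```

Real migrations add a reconciliation step after the load (row counts and checksums compared between source and target), which is where most project effort tends to go.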
Posted 2 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview The primary focus is development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards. This role also carries L3 responsibilities for ETL processes. Responsibilities Delivery of key Azure Data Lake projects within time and budget. Contribute to solution design and build to ensure scalability, performance, and reuse of data and other components. Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards. Possess strong problem-solving abilities with a focus on managing to business outcomes through collaboration with multiple internal and external parties. Enthusiastic, willing, and able to learn and continuously develop skills and techniques; enjoys change and seeks continuous improvement. A clear communicator, both written and verbal, with good presentational skills; fluent and proficient in the English language. Customer focused and a team player. Qualifications Bachelor’s degree in Computer Science, MIS, Business Management, or related field. 5+ years’ experience in Information Technology. 4+ years’ experience in Azure Data Lake. Technical Skills: Proven experience in development activities on Data, BI or Analytics projects. Solutions delivery experience - knowledge of system development lifecycle, integration, and sustainability. Strong knowledge of Pyspark and SQL. Good knowledge of Azure Data Factory or Databricks. Knowledge of Presto / Denodo is desirable. Knowledge of FMCG business processes is desirable. Non-Technical Skills: Excellent remote collaboration skills. Experience working in a matrix organization with diverse priorities. Exceptional written and verbal communication skills along with collaboration and listening skills. Ability to work with agile delivery methodologies. Ability to ideate requirements and design iteratively with business partners without formal requirements documentation.
Posted 2 days ago
3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role Grade Level (for internal use): 09 The Role: As a Software Developer with the Data & Research Development team, you will be responsible for developing & providing backend support across a variety of products within the Market Intelligence platform. Together, you will build scalable and robust solutions using AGILE development methodologies with a focus on high availability to end users. The Team: Do you love to collaborate & provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams. The Impact: We focus primarily on developing, enhancing and delivering required pieces of information & functionality to internal & external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact. What’s in it for you? Opportunities for innovation and learning new state-of-the-art technologies To work in pure agile & scrum methodology Responsibilities Deliver solutions within a multi-functional Agile team Develop expertise in our proprietary enterprise software products Set and maintain a level of excitement in using various technologies to develop, support, and iteratively deploy real enterprise-level software Achieve an understanding of customer environments and their use of the products Build solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements Apply software engineering practices and implement automation across all elements of solution delivery Basic Qualifications What we’re looking for: 3-6 years of desktop application development experience with a deep understanding of Design Patterns & Object-oriented programming.
Hands-on development experience using C#, .NET 4.0/4.5, WPF, ASP.NET, and SQL Server. Strong OOP and Service-Oriented Architecture (SOA) knowledge. Strong understanding of cloud applications (containers, Docker, etc.); exposure to data ETL will be a plus. Ability to resolve serious performance-related issues through various techniques, including testing, debugging and profiling. Strong problem-solving, analytical and communication skills. Possess a true “roll up the sleeves and get it done” working approach; demonstrated success as a problem solver, operating as a client-focused self-starter. Preferred Qualifications Bachelor's degree in computer science or computer engineering About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all.
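The design-pattern knowledge this role asks for can be illustrated with a minimal observer pattern, the kind of publish/subscribe decoupling that WPF data binding builds on. This is a generic sketch in Python rather than the C# the role uses, and the class and event names are hypothetical:

```python
class PriceFeed:
    """Subject in a minimal observer pattern: subscribers register callbacks
    and are notified on every published update, with no compile-time coupling
    between publisher and consumers."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, ticker, price):
        for callback in self._subscribers:
            callback(ticker, price)

seen = []
feed = PriceFeed()
feed.subscribe(lambda ticker, price: seen.append((ticker, price)))
feed.publish("SPGI", 512.4)
```

In C#/WPF the same decoupling is usually expressed through events and `INotifyPropertyChanged` rather than explicit callback lists.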
From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 313152 Posted On: 2025-05-05 Location: Hyderabad, Telangana, India
Posted 2 days ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview Deputy Director - Data Engineering PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. What PepsiCo Data Management and Operations does: Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company. Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders. Increase awareness about available data and democratize access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. 
As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Responsibilities Data engineering lead role for D&Ai data modernization (MDIP). The candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements. Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work as well as empowering them to realize their full potential. Design, structure and store data into unified data models and link them together to make the data reusable for downstream products. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL. Enable and accelerate standards-based development, prioritizing reuse of code; adopt test-driven development, unit testing and test automation with end-to-end observability of data. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost.
Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application following well-architected design standards. Define and manage SLAs for data products and processes running in production. Create documentation for learnings and knowledge transfer to internal associates. Qualifications 12+ years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture. 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools. 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL or any other popular RDBMS. 6+ years of experience in Python/Pyspark/Scala programming on big data platforms like Databricks. 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one business intelligence tool such as Power BI or Tableau. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like ADO or GitHub and CI/CD tools for DevOps automation and deployments.
Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
Experience with statistical/ML techniques is a plus.
Experience with building solutions in the retail or supply chain space is a plus.
Understanding of metadata management, data lineage, and data glossaries is a plus.
BA/BS in Computer Science, Math, Physics, or other technical fields.
Candidates must be flexible to work an alternative work schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with the immediate supervisor.

Skills, Abilities, Knowledge:
Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
Proven track record of leading and mentoring data teams.
Strong change manager; comfortable with change, especially that which arises through company growth.
Ability to understand and translate business requirements into data and technical requirements.
High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
Foster a team culture of accountability, communication, and self-management.
Proactively drives impact and engagement while bringing others along.
Consistently attain/exceed individual and team goals.
Ability to lead others without direct authority in a matrixed environment.
Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
Domain knowledge in the CPG industry with a supply chain/GTM background is preferred.
Posted 2 days ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
This candidate is responsible for:
DB2 installation and configuration on the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF.
Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2).
Create high-level designs and detailed-level designs, and maintain product roadmaps that include both modernization and leveraging cloud solutions.
Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML.
Perform health checks of the databases, make recommendations, and deliver tuning at the database and system level.
Deploy DB2 databases as containers within Red Hat OpenShift clusters.
Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability.
Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with overall enterprise data strategy and business objectives.
Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources).
Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse.
Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams.
Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms.
Participate in the development of architecture governance processes and promote best practices across the organization.
Communicate complex technical concepts to both technical and non-technical stakeholders.

Required Technical And Professional Expertise
15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms.
Strong proficiency in DB2, SQL, and Python.
Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark).
Database migration project experience from one database to another (target database Db2).
Experience in deploying DB2 databases as containers within Red Hat OpenShift clusters, and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
Excellent communication, collaboration, problem-solving, and leadership skills.
Preferred Technical And Professional Experience
Experience with machine learning environments and LLMs.
Certification in IBM watsonx.data or related IBM data and AI technologies.
Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake).
Exposure to implementing, or an understanding of, DB replication processes.
Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures).
Experience with NoSQL databases (e.g., MongoDB, Cassandra).
Experience with data modeling tools (e.g., ER/Studio, ERwin).
Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA).
Posted 2 days ago
7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Manager – Azure Data Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.
Your Key Responsibilities
Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance, and data management skills.
Interact with senior client technology leaders, understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
Define and develop client-specific best practices around data management within a cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
Have managed teams and have experience in end-to-end delivery.
Have experience building technical capability and teams to deliver.

Skills And Attributes For Success
Strong understanding of and familiarity with all Cloud Ecosystem components.
Strong understanding of underlying cloud architectural concepts and distributed computing paradigms.
Experience in the development of large-scale data processing.
Experience with CI/CD pipelines for data workflows in Azure DevOps.
Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, and SQL.
Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem.
Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance.
Experience with BI and data analytics databases.
Experience in converting business problems/challenges to technical solutions considering security, performance, scalability, etc.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Strong stakeholder, client, team, process, and delivery management skills.

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
Strong verbal and written communication skills.
Must be a team player and enjoy working in a cooperative and collaborative team environment.
Adaptable to new technologies and standards.
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 8-11 years of industry experience.

Ideally, you’ll also have
Project management skills
Client management skills
Solutioning skills
Nice to have: knowledge of data security best practices and Data Architecture Design Patterns.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Work with Match360, Publisher, and Watsonx integrations to modernize MDM workloads. Drive architectural decisions and ensure alignment with product roadmaps and enterprise standards.

Secondary: Informatica MDM (Desirable Skillset)
Understand key concepts of Informatica MDM, including: landing, staging, base objects, trust and match rules; hierarchy configuration, E360 views, and SIF/REST API integrations.
Support data ingestion processes (batch and real-time), transformation, and cleansing routines via IDQ and Java-based user exits.
Provide insights and inputs to help us strategically position IBM MDM against Informatica, shaping unique assets and accelerators.

Cross-Functional and Strategic Responsibilities
Collaborate with data governance and business teams to implement DQ rules, lineage, and business glossaries.
Mentor junior developers; participate in design/code reviews and knowledge-sharing sessions.
Create and maintain documentation: architecture diagrams, integration blueprints, solution specs.
Stay current with modern MDM practices, AI/ML in data mastering, and cloud-first platforms (e.g., CP4D, IICS, Snowflake, Databricks).
Experience with other database platforms and technologies (e.g., DB2, Oracle, SQL Server).
Experience with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools.
Knowledge of database regulatory compliance requirements (e.g., GDPR, HIPAA).

Your Role And Responsibilities
We are seeking an experienced and self-driven Senior MDM Consultant to design, develop, and maintain enterprise-grade Master Data Management solutions, with a primary focus on IBM MDM and foundational knowledge of Informatica MDM. This role will play a key part in advancing our data governance, quality, and integration strategies across customer, product, and party domains. Experience in IBM DataStage, Knowledge Catalog, Cloud Pak for Data, and Manta is important.
You will work closely with cross-functional teams including Data Governance, Source System Owners, and Business Data Stewards to implement robust MDM solutions that ensure consistency, accuracy, and trustworthiness of enterprise data.

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
Strong Hands-on Experience With:
Informatica MDM 10.x, IDQ, and Java-based user exits.
MDM components: base/landing/staging tables, relationships, mappings, hierarchy, E360.
Informatica PowerCenter, IICS, or similar ETL tools.
REST APIs, SOA, event-based integrations, and SQL/RDBMS.
Familiarity with IBM MDM core knowledge in matching, stewardship UI, workflows, and metadata management.
Excellent understanding of data architecture, governance, data supply chain, and lifecycle management.
Strong communication, documentation, and stakeholder management skills.
Experience with cloud MDM/SaaS solutions and DevOps automation for MDM deployments.
Knowledge of BAW, Consent Management, Account & Macro Role configuration.

Preferred Technical And Professional Experience
Other required skills: IBM DataStage, Knowledge Catalog, Cloud Pak for Data, Manta
Posted 2 days ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation skills. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you.
To help achieve this win-win outcome, a ‘day-in-the-life’ of this opportunity may include, but not be limited to…
Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations.
Technical Solution Workshops: Conducting and participating in technical solution workshops.
Building Effective Relationships: Developing successful relationships at all levels—from engineers to CxOs—with experience navigating challenging debate to reach healthy resolutions.
Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM team.

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
In-depth knowledge of the IBM Data & AI portfolio.
15+ years of experience in software services.
10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms.
Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage Nextgen, and Orchestration Pipelines.
10+ years’ experience with ETL and database technologies.
Experience in architectural planning and implementation for the upgrade/migration of these specific products.
Experience in designing and implementing Data Quality solutions.
Experience with installation and administration of these products.
Excellent understanding of cloud concepts and infrastructure.
Excellent verbal and written communication skills are essential.

Preferred Technical And Professional Experience
Experience with any of the DataStage, Informatica, SAS, or Talend products.
Experience with any of IKC, IGC, or Axon.
Experience with programming languages like Java/Python.
Experience with the AWS, Azure, Google, or IBM cloud platforms.
Experience with Red Hat OpenShift.
Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.
Posted 2 days ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – GIG - Data Modeller
EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for a candidate with 3-7 years of expertise in data science, data analysis, and visualization skills, to act as a Technical Lead to a larger team in the EY GDS DnA team and work on various Data and Analytics projects.

Your Key Responsibilities
Lead and mentor a team throughout the design, development, and delivery phases, and keep the team intact in high-pressure situations.
Work as a senior team member to contribute to various technical streams of EY DnA implementation projects.
Be client-focused, with good presentation, communication, and relationship-building skills.
Completion of assigned tasks on time and regular status reporting to the lead.
Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques, validate the model results, and articulate the insights to the business team.
Interface and communicate with the onsite teams directly to understand the requirements and determine the optimum solutions.
Create technical solutions as per business needs by translating their requirements and finding innovative solution options.
Provide product- and design-level functional and technical expertise along with best practices.
Get involved in business development activities like creating proofs of concept (POCs) and points of view (POVs), assist in proposal writing and service offering development, and be capable of developing creative PowerPoint content for presentations.
Participate in organization-level initiatives and operational activities.
Ensure continual knowledge management and contribute to internal L&D teams.
Build a quality work culture, foster teamwork, and lead by example.

Skills and attributes for success
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations.
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

To qualify for the role, you must have
BE/BTech/MCA/MBA with 3+ years of industry experience with machine learning, visualization, data science, and related offerings.
At least 3+ years of experience in BI and Analytics.
The ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture, and data modelling.
Knowledge and experience of at least one insurance domain engagement, Life or Property and Casualty.
Understanding of Business Intelligence, Data Warehousing, and Data Modelling.
Good experience using CA Erwin or another similar modelling tool is an absolute must.
Experience working with Guidewire DataHub and InfoCenter.
Strong knowledge of relational and dimensional data modelling concepts.
Develop logical and physical data flow models for ETL applications.
Translate data access, transformation, and movement requirements into functional requirements and mapping designs.
Strong knowledge of data architecture, database structure, data analysis, and SQL skills.
Experience in data management analysis.
Analyse business objectives and evaluate data solutions to meet customer needs.
Establish scalable, efficient, automated processes for large-scale data analyses and management.
Prepare and analyse historical data and identify patterns.
Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques; validate the model results and articulate the insights to the business team.
Drive business requirements gathering for analytics projects.
Intellectual curiosity - eagerness to learn new things.
Experience with unstructured data is an added advantage.
Ability to effectively visualize and communicate analysis results.
Experience with big data and cloud preferred.
Experience, interest, and adaptability to working in an Agile delivery environment.
Ability to work in a fast-paced environment where change is a constant, and ability to handle ambiguous requirements.
Exceptional interpersonal and communication skills (written and verbal).

Ideally, you’ll also have
Good exposure to any ETL tools.
Good to have knowledge about P&C insurance.
Understanding of Business Intelligence, Data Warehousing, and Data Modelling.
Must have led a team of at least 4 members.
Experience in the Insurance and Banking domains.
Prior client-facing skills; self-motivated and collaborative.

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.
An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide.
Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching, and feedback from some of the most engaging colleagues around.
Opportunities to develop new skills and progress your career.
The freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 days ago
4.0 - 6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title: SQL Database Administrator Location: Nippon Q1 Business Centre, Kochi Experience: 4-6 Years Type: Full-time
About Us: Expericia Technologies is a fast-growing IT services company specializing in enterprise applications, custom software development, SharePoint, .NET, Azure, React and more. We're passionate about technology and innovation, and about building solutions that solve real-world problems. Join us to work on exciting projects, learn directly from industry veterans, and grow your career the right way.
About the Role: We are looking for an experienced SQL Developer/Administrator with 4-6 years of expertise in database management, performance optimization, and ETL processes. The ideal candidate will handle database administration tasks, optimize performance, and support analytics by developing ETL processes for data transformation and reporting.
Key Responsibilities:
· Design, develop, and optimize SQL queries, stored procedures, and reports.
· Perform data analysis and support decision-making with accurate, efficient reports.
· Collaborate with business teams to provide tailored database solutions.
· Optimize SQL queries and database performance, including troubleshooting and tuning.
· Administer and manage SQL Server databases, ensuring availability, security, and data integrity.
· Implement and manage ETL processes for data extraction, transformation, and loading.
· Develop and maintain dashboards and reporting solutions using SQL and ETL tools.
· Ensure data quality and troubleshoot any ETL-related issues.
· Support database migrations, upgrades, and high-availability configurations.
Skills and Qualifications:
· 4-6 years of experience in SQL development and administration, with a focus on ETL processes.
· Strong expertise in T-SQL, SQL Server, and ETL tools (e.g., SSIS, Talend).
· Proficient in database performance tuning, query optimization, and backup/recovery strategies.
· Strong problem-solving and analytical skills.
· Bachelor’s degree in Computer Science or a related field.
Preferred Qualifications:
· Experience with cloud migration and data warehousing solutions.
· Experience with cloud platforms (AWS RDS, Azure SQL).
· Familiarity with high-availability configurations and data integration.
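The extract-transform-load responsibility described in this role can be sketched end to end in a few lines. A minimal illustration using Python's standard-library sqlite3 module, with invented table and column names (a real deployment would target SQL Server via an ETL tool such as SSIS, not SQLite):

```python
import sqlite3

# Minimal ETL sketch: staging table -> cleaned target table -> aggregate check.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging_orders (id INTEGER, amount TEXT, region TEXT)")
con.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)",
                [(1, "100.0", "north"), (2, "250.5", "north"), (3, "80.0", "south")])

# Extract: pull raw rows from staging
rows = con.execute("SELECT id, amount, region FROM staging_orders").fetchall()

# Transform: cast amounts to float, normalise region names
clean = [(i, float(a), r.upper()) for i, a, r in rows]

# Load into the target table, then verify with an aggregate query
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
totals = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('NORTH', 350.5), ('SOUTH', 80.0)]
```

The final aggregate doubles as a data-quality check, the kind of validation step the responsibilities above call out under "ensure data quality".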
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Hi All, We are hiring for our investment banking client in Mumbai (Powai).
Location: Mumbai (locals only). Experience: 4-8 years. Budget: Open, competitive market rate. Interview Mode: 1st round virtual; 2nd/3rd rounds compulsory face-to-face; there may be more than 3 rounds.
Required Details: Total Experience, Relevant Experience, Current Company, Current Designation, Current CTC, Expected CTC, Notice Period, Current Location, Expected Location, Offer in Hand, Reason for Job Change, Degree, CGPA, Year Passed Out.
JD: Requirements (mandatory and/or preferred):
Mandatory:
Must have extensive development experience in Informatica.
Sound knowledge of transformations, mappings and workflows.
Good knowledge of relational databases (MSSQL/Oracle) and SQL.
Good knowledge of Oracle/SQL stored procedures, packages and functions.
Good knowledge of Unix shell scripting.
Good communication skills; must be able to interact at all levels on a wide range of issues.
Must adapt to dynamic business requirements that alter project flows; flexible to change and able to multi-task.
Hardworking and self-motivated.
Preferred:
Investment Banking domain knowledge.
Proactive and willing to learn.
Knowledge of Autosys.
Knowledge of Python.
Posted 2 days ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
These cities are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
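Data modeling in the ETL context usually means loading facts into a star schema, which in turn means resolving each record's natural business key to a surrogate key held in a dimension table. A toy sketch of that lookup, with invented names, to show the mechanic:

```python
# Toy star-schema load: resolve each fact's natural customer key to the
# surrogate key held in the customer dimension (all names are illustrative).
customer_dim = {}   # natural key -> surrogate key
next_sk = 1

def surrogate_key(natural_key: str) -> int:
    """Return the dimension surrogate key, inserting a new row if unseen."""
    global next_sk
    if natural_key not in customer_dim:
        customer_dim[natural_key] = next_sk
        next_sk += 1
    return customer_dim[natural_key]

facts = [("CUST-9", 120.0), ("CUST-3", 55.0), ("CUST-9", 30.0)]
fact_table = [(surrogate_key(nk), amount) for nk, amount in facts]
print(fact_table)   # [(1, 120.0), (2, 55.0), (1, 30.0)]
```

ETL tools such as Informatica and SSIS provide this as a built-in lookup transformation; understanding what it does underneath is part of the data-modeling foundation described above.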
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!