
23992 ETL Jobs - Page 4

JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

4.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

This is a full-time on-site role for a SQL Developer located in Noida. As a SQL Developer, you will be responsible for database development, ETL (Extract, Transform, Load), database design, and data modeling on a day-to-day basis, applying strong analytical skills. You should possess a Bachelor's degree or equivalent in Computer Science or a related field, along with 4-10 years of industry experience. Experience working with SQL relational database management systems is essential, and SQL scripting knowledge would be a plus. Your responsibilities will include creating interfaces for upstream/downstream applications and designing, building, testing, deploying, and scheduling integration processes involving third-party systems. In this role, you will design and develop integrations using the Boomi AtomSphere integration platform, Workforce Integration Manager, or similar integration tools. Knowledge of REST APIs, the SOAP framework, XML, and web service design would be beneficial. Strong oral and written communication skills, as well as good customer-interfacing skills, are required for this position. Other responsibilities include implementing software in various environments using Professional Services concepts, following the SDLC process to provide interface solutions, understanding client requirements, preparing design documents, coding, testing, and deploying interfaces, providing User Acceptance Testing support, deploying and releasing to the production environment, and handing off to global support.
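
For illustration, here is a minimal Python sketch of the kind of upstream-to-downstream interface work described above: pull records from a REST endpoint and upsert them into a staging table. The endpoint URL, table, and column names are hypothetical, and SQLite stands in for a production RDBMS.

```python
# Hypothetical upstream-to-database integration step. In a real engagement this
# would target a Boomi flow or a production RDBMS rather than SQLite.
import sqlite3

import requests

API_URL = "https://example.com/api/employees"  # hypothetical upstream endpoint


def sync_employees(db_path: str = "staging.db") -> int:
    """Pull records from an upstream REST API and upsert them into staging."""
    records = requests.get(API_URL, timeout=30).json()

    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS employees_stg (
               emp_id INTEGER PRIMARY KEY, name TEXT, dept TEXT)"""
    )
    # Upsert so that scheduled reruns of the interface stay idempotent.
    conn.executemany(
        """INSERT INTO employees_stg (emp_id, name, dept)
           VALUES (:emp_id, :name, :dept)
           ON CONFLICT(emp_id) DO UPDATE SET
               name = excluded.name, dept = excluded.dept""",
        records,
    )
    conn.commit()
    conn.close()
    return len(records)
```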

Posted 1 day ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

Who are we looking for? We are looking for a candidate with 7+ years of administrator experience in MongoDB/Cassandra/Snowflake databases. This role is focused on production support, ensuring database performance, availability, and reliability across multiple clusters. The ideal candidate will be responsible for ensuring the availability, performance, and security of our NoSQL database environment. You will provide 24/7 production support, troubleshoot issues, monitor system health, optimize performance, and collaborate with cross-functional teams to maintain a reliable and efficient Snowflake platform.

Technical Skills: Proven experience as a MongoDB/Cassandra/Snowflake database administrator or in a similar role in production support environments. 7+ years of hands-on experience as a MongoDB DBA supporting production environments. Strong understanding of MongoDB architecture, including replica sets, sharding, and the aggregation framework. Proficiency in writing and optimizing complex MongoDB queries and indexes. Experience with backup and recovery solutions (e.g., mongodump, mongorestore, Ops Manager). Solid knowledge of Linux/Unix systems and scripting (Shell, Python, or similar). Experience with monitoring tools like Prometheus, Grafana, DataStax OpsCenter, or similar. Understanding of distributed systems and high-availability concepts. Proficiency in troubleshooting cluster issues, performance tuning, and capacity planning. In-depth understanding of data management (e.g., permissions, recovery, security, and monitoring). Understanding of ETL/ELT tools and data integration patterns. Strong troubleshooting and problem-solving skills. Excellent communication and collaboration abilities. Ability to work in a 24/7 support rotation and handle urgent production issues. Strong understanding of relational database concepts. Experience with database design, modeling, and optimization is good to have. Familiarity with data security and backup best practices.

Support & Incident Management: Provide 24/7 support for MongoDB environments, including an on-call rotation. Monitor system health and respond to alerts, incidents, and performance degradation issues. Troubleshoot and resolve production database issues in a timely manner.

Database Administration: Install, configure, and upgrade MongoDB clusters in on-prem or cloud environments. Perform routine maintenance including backups, restores, indexing, and data migration. Monitor and manage replica sets, sharding, and cluster health.

Tuning & Optimization: Analyze query and indexing strategies to improve performance. Tune MongoDB server parameters and JVM settings where applicable. Monitor and optimize disk I/O, memory usage, and CPU utilization.

Security & Compliance: Implement and manage access control, roles, and authentication mechanisms (LDAP, x.509, SCRAM). Ensure encryption, auditing, and compliance with data governance and security policies.

Automation & Monitoring: Create and maintain scripts for the automation of routine tasks (e.g., backups, health checks). Set up and maintain monitoring tools (e.g., MongoDB Ops Manager, Prometheus/Grafana, MMS).

Documentation & Collaboration: Maintain documentation on architecture, configurations, procedures, and incident reports. Work closely with application and infrastructure teams to support new releases and changes.

Nice to Have: Experience with MongoDB Atlas and other cloud-managed MongoDB services. MongoDB certification (MongoDB Certified DBA). Experience with automation tools like Ansible, Terraform, or Puppet. Understanding of DevOps practices and CI/CD integration. Familiarity with other NoSQL and RDBMS technologies is a plus.

Education qualification: Any degree. (ref:hirist.tech)
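
A minimal Python sketch of the replica-set monitoring this role describes, using pymongo; the connection URI and alerting behaviour are placeholders, not details from the posting.

```python
# Hypothetical replica-set health check; in production the alert would go to a
# paging system rather than stdout.
from pymongo import MongoClient


def check_replica_set(uri: str = "mongodb://localhost:27017/") -> list:
    """Flag replica-set members not in a healthy PRIMARY/SECONDARY state."""
    client = MongoClient(uri, serverSelectionTimeoutMS=5000)
    status = client.admin.command("replSetGetStatus")

    problems = []
    for member in status["members"]:
        # stateStr is e.g. PRIMARY, SECONDARY, RECOVERING, or DOWN
        if member["stateStr"] not in ("PRIMARY", "SECONDARY"):
            problems.append(f"{member['name']} is {member['stateStr']}")
    return problems


if __name__ == "__main__":
    for issue in check_replica_set():
        print("ALERT:", issue)
```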

Posted 1 day ago

Apply

6.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Summary: We are seeking a highly experienced and results-driven Senior ETL Developer with over 6 years of professional experience in data integration, transformation, and analytics across enterprise-grade data platforms. This role requires deep expertise in ETL development, strong familiarity with cloud-based data solutions, and the ability to manage large-scale data operations. The candidate should be capable of working across complex data environments, including structured and unstructured datasets, and demonstrate fluency in handling both traditional and modern cloud data ecosystems. The ideal candidate must have strong hands-on experience with ETL tools, advanced SQL and Python scripting, big data processing, and cloud-based data services, particularly within the AWS ecosystem. This position will play a key role in the design, development, and optimization of scalable data pipelines and contribute to enterprise-level data engineering solutions, while supporting analytical and reporting needs in both Application Development (AD) and Application Maintenance Support (AMS) environments.

Key Responsibilities: Design, develop, and maintain efficient and scalable ETL pipelines using modern data tools and platforms, focusing on extraction, transformation, and loading of large datasets from multiple sources. Work closely with data architects, analysts, and other stakeholders to understand business data requirements and translate them into robust technical ETL solutions. Implement and optimize data loading, transformation, cleansing, and integration strategies to ensure high performance and quality in downstream applications. Develop and manage cloud-based data platforms, particularly within the AWS ecosystem, including services such as Amazon S3, EMR, MSK, and SageMaker. Collaborate with cross-functional teams to integrate data from various databases such as Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra. Employ scripting languages like SQL, PL/SQL, Python, and Unix shell commands to automate data transformations and monitoring processes. Leverage big data technologies such as Apache Spark and Sqoop to handle large-scale data workloads and enhance data processing capabilities. Support and contribute to data modeling initiatives using tools like Erwin and Oracle Data Modeler; exposure to Archimate will be considered an advantage. Work with scheduling and orchestration tools including Autosys, SFTP, and preferably Apache Airflow to manage ETL workflows efficiently. Troubleshoot and resolve data inconsistencies, data load failures, and performance issues across the data pipeline and cloud infrastructure. Follow best practices in data governance, metadata management, version control, and data quality frameworks to ensure compliance and consistency. Maintain documentation of ETL processes, data flows, and integration points for knowledge sharing and auditing purposes. Participate in code reviews and knowledge transfer sessions, and mentor junior developers in ETL practices and cloud integrations. Stay up to date with evolving technologies and trends in data engineering, cloud services, and big data to proactively propose improvements.

Technical Skills:

ETL Tools: Experience with Talend is preferred (especially in AD and AMS functions), although it may be phased out in the future.

Databases: Expertise in Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra.

Big Data & Cloud: Hands-on with Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark.

Scripting: Strong skills in SQL, PL/SQL, and Python; knowledge of the Unix command line is essential; R programming is optional but considered a plus.

Scheduling Tools: Working knowledge of Autosys, SFTP, and preferably Apache Airflow (training can be provided).

Data Modeling Tools: Proficiency in Erwin and Oracle Data Modeler; familiarity with Archimate is preferred.

Notes: Power BI knowledge is relevant only in shared AD roles and is not required for dedicated ETL and AWS roles or AMS responsibilities. The role requires strong communication skills to collaborate with technical and non-technical stakeholders, as well as a proactive mindset to identify and resolve data challenges. Candidates must demonstrate the ability to adapt in fast-paced and changing environments while maintaining attention to detail and delivery quality. Exposure to enterprise data warehouse modernization, cloud migration projects, or real-time streaming data pipelines is considered highly advantageous. (ref:hirist.tech)
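
To make the pipeline work concrete, here is a simplified PySpark sketch of an extract-transform-load step of the kind described above; the S3 paths, table, and column names are hypothetical.

```python
# Hypothetical daily-revenue ETL: read raw CSVs (e.g. from S3 on EMR), cleanse
# and aggregate, then write a partitioned columnar output for analytics.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: land raw order data
orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: type-cast, filter out bad rows, and derive a daily aggregate
daily_revenue = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Load: partitioned Parquet for downstream consumers
(daily_revenue.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_revenue/"))
```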

Posted 1 day ago

Apply

3.0 - 23.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Relay Human Cloud is a young and dynamic company dedicated to assisting some of the top US-based companies in expanding their teams internationally. With a truly global presence spanning the US, India, Honduras, and Mexico, Relay is focused on facilitating connections with the best international talent. The company's core areas of expertise include Accounting & Finance, Administration, Operations, Space Planning, Leasing, Data Science, Data Search, Machine Learning, and Artificial Intelligence. Operating from offices in Ahmedabad and Vadodara, Relay India is committed to delivering high-quality operations with a focus on cutting-edge technologies. We are currently seeking a talented and dedicated Yardi Report Developer with a robust background in YSR reporting to join our team. In this role, you will work closely with our US-based clients to design, develop, and maintain custom reports and data visualization solutions within the Yardi property management software. Your contributions will be instrumental in providing accurate insights to support decision-making and optimize property management operations. Key Responsibilities: - Develop and maintain custom YSR reports within the Yardi Voyager property management software. - Collaborate with business stakeholders to understand their reporting and data visualization requirements. - Design and develop dynamic and interactive reports and dashboards to deliver valuable insights. - Troubleshoot and address any issues related to report performance or data accuracy. - Create and update documentation for YSR reports and processes for future reference. - Keep abreast of Yardi software updates and new features, implementing them as necessary. - Assist in data extraction, transformation, and loading (ETL) processes to meet reporting needs. - Conduct ad-hoc data analysis and reporting tasks as requested by management. - Provide training and support to end users on YSR reporting capabilities and best practices. Qualifications: - Proficiency in English is essential due to direct interaction with US-based clients. - Bachelor's degree in Computer Science, Information Technology, or related fields (or equivalent work experience). - Extensive experience (2-3 years) in Yardi property management software with expertise in YSR reporting. - Strong understanding of SQL, data modeling, and data warehousing concepts. - Proficiency in report development tools and technologies like Yardi Voyager, YSR, SSRS, Power BI, or similar. - Excellent problem-solving and analytical skills. - Detail-oriented with a focus on ensuring data accuracy and report quality. - Self-motivated and capable of working independently or collaboratively within a team. Preferred Qualifications: - Previous experience in the real estate or property management industry. - Familiarity with ETL tools and processes. - Knowledge of data visualization best practices.

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Join us as an "Assistant VP" at Barclays, where you will be involved in functional design, data, end-to-end process and controls, delivery, and functional testing. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To Be Successful In This Role, You Should Have: Experience supporting the development of dashboards in SAP Analytics Cloud and Tableau; primary experience in SAP Analytics Cloud and SAP-related toolsets is preferred. The ability to develop process workflows and manage ETL tools like SAP BW, Alteryx, etc. The ability to provide design solutions for internal reporting problem statements and business requirements with quick delivery using tactical solutions, while connecting with the strategic roadmap as well. The ability to act as a business analyst supporting the function, thinking from a strategic point of view and delivering MI views that enable analytics and support quick decision-making. The ability to support the business on an agile basis in delivering requirements, which is critical in a DevOps model. The ability to build innovative dashboards on a sprint basis with a key focus on controls and governance structure. The ability to visually enhance analytical views from the legacy Excel/PPT model. Adherence to all IR controls, developing and implementing robust control mechanisms in all managed processes.

Some Other Highly Valued Skills May Include: Knowledge of Business Intelligence platforms, primarily SAP Analytics Cloud, and the ability to work with data management tools. Project management/scrum master capabilities to drive prioritization. Experience designing MI dashboards and insights. Broad business and industry knowledge and experience.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role will be based out of Chennai.

Purpose of the role: To develop business capabilities for Finance through key stages of functional design, data, end-to-end process and controls, delivery, and functional testing.

Accountabilities: Functional Design: leveraging best-practice concepts, and in collaboration with Line SMEs, support options analysis and recommendations as part of decision making. Data Analysis/Modelling/Governance: design the conceptual data model underpinning all phases of the processes, and governance requirements in accordance with GDMS standards and principles. End-to-End Process & Controls: development of target process and controls design/documentation and operational runbooks, and aligning these components with organizational and role/service model design definitions. Delivery/Implementation Support: update design/functional requirements throughout the development cycle, and resolve RAIDs related to functional requirements and business processes; project management for change programmes that have limited technology investment. Functional Testing: develop scripts and data to test alignment to requirement definitions, ahead of user testing cycles.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives, and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You will be joining our team as a Looker Enterprise Dashboarding Specialist. Your main responsibility will be to design, develop, and optimize Looker dashboards to extract actionable insights from complex datasets. To excel in this role, you should have a solid understanding of LookML, data modeling, SQL, and data visualization best practices. You will collaborate with data analysts, engineers, and business stakeholders to create impactful reports and dashboards. Your key responsibilities will include designing, developing, and maintaining Looker dashboards and reports to support business decision-making. You will also be tasked with building and optimizing LookML models, explores, and views to ensure efficient data querying. Collaborating with data engineering teams to enhance data pipelines and model performance will be essential. Working closely with business stakeholders to understand reporting needs and convert them into scalable Looker solutions is also a crucial part of your role. Implementing best practices for data visualization to ensure clear and effective storytelling will be a key aspect. Furthermore, optimizing dashboard performance, developing and maintaining data governance standards for Looker usage, and conducting training sessions for internal teams to enhance self-service analytics adoption will fall under your responsibilities. Staying abreast of Looker updates, new features, and industry best practices is also expected. To qualify for this position, you should have 3-5 years of experience in data visualization, business intelligence, or analytics. Strong expertise in Looker, LookML, and SQL is a must. Experience in data modeling, familiarity with BigQuery or other cloud data warehouses, an understanding of data governance, security, and role-based access control in Looker, the ability to optimize dashboards for performance and usability, strong problem-solving and analytical skills with attention to detail, and excellent communication and stakeholder management skills are necessary. Preferred qualifications include experience working with ETL pipelines and data transformation processes, familiarity with Python or other scripting languages for data automation, exposure to Google Cloud Platform (GCP) and data engineering concepts, and certifications in Looker, Google Cloud, or related BI tools.

Posted 1 day ago

Apply

5.0 - 12.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You should have 5-12 years of experience in Big Data and data-related technologies. Your expertise should include a deep understanding of distributed computing principles and strong knowledge of Apache Spark. Proficiency in Python programming is required, along with experience using technologies such as Hadoop v2, MapReduce, HDFS, Sqoop, Apache Storm, and Spark Streaming for building stream-processing systems. You should have a good understanding of Big Data querying tools like Hive and Impala, as well as experience integrating data from various sources such as RDBMS, ERP systems, and files. Knowledge of SQL queries, joins, stored procedures, and relational schemas is essential. Experience with NoSQL databases like HBase, Cassandra, and MongoDB, along with ETL techniques and frameworks, is also expected. The role requires performance tuning of Spark jobs, experience with Azure Databricks, and the ability to lead a team efficiently. Designing and implementing Big Data solutions, as well as following the Agile methodology, are key aspects of this position.
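
As a concrete illustration of the Spark performance-tuning work mentioned above, here is a small PySpark sketch with hypothetical Hive table names, showing two common levers: adaptive query execution and a broadcast join that avoids shuffling a large fact table.

```python
# Hypothetical tuning example: AQE settings plus a broadcast join.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder.appName("spark_tuning_demo")
    .config("spark.sql.shuffle.partitions", "200")  # size to data volume/cores
    .config("spark.sql.adaptive.enabled", "true")   # let AQE coalesce partitions
    .enableHiveSupport()
    .getOrCreate()
)

facts = spark.table("sales_facts")  # large fact table (hypothetical)
dims = spark.table("store_dim")     # small dimension table (hypothetical)

# Broadcasting the small dimension ships it to every executor, so the large
# fact table is never shuffled across the network for the join.
joined = facts.join(broadcast(dims), on="store_id", how="left")
joined.write.mode("overwrite").saveAsTable("sales_enriched")
```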

Posted 1 day ago

Apply

0.0 - 4.0 years

0 Lacs

Haryana

On-site

As a Data Science Engineer Intern at V-Patrol AI, a dynamic and forward-thinking cybersecurity organization, you will play a crucial role in developing and implementing cutting-edge machine learning and deep learning models. Your primary focus will be on creating scalable data pipelines and generating valuable insights in real time to counter cyber threats effectively. Your responsibilities will include designing and executing machine learning and deep learning models tailored for cybersecurity applications. Additionally, you will be involved in constructing and overseeing data pipelines for both structured and unstructured data sources, such as network logs and threat feeds. Integrating APIs for model deployment and ensuring seamless real-time data flow will also be a key aspect of your role. Collaboration with software engineers, analysts, and stakeholders to support data-informed decision-making processes is essential. Monitoring model performance and optimizing models for production environments will be part of your routine tasks. Furthermore, you will be responsible for conveying your findings through informative dashboards, reports, and visualizations. To excel in this role, you should hold a Bachelor's or Master's degree in data science, computer science, statistics, or a related field. Proficiency in Python, pandas, scikit-learn, and TensorFlow/PyTorch is necessary. Hands-on experience with REST APIs, FastAPI/Flask, and data preprocessing techniques is crucial. Familiarity with various ML/DL models like XGBoost, LSTMs, and Transformers is expected. Exposure to cloud platforms such as AWS/GCP, ETL tools, Docker/Kubernetes, etc., would be advantageous. While not mandatory, prior experience in cybersecurity, particularly in areas like threat detection and incident response, would be beneficial. In addition to the required skills and experience, expertise in adversarial machine learning and natural language processing (NLP) will be considered a significant advantage. Having a GitHub profile or a portfolio showcasing real-world projects in data science or cybersecurity will be a strong preference. This position is an internship opportunity that requires your presence at the designated work location.
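
A compact, self-contained sketch of the workflow described: train a small classifier and expose it for real-time scoring through FastAPI. The two-feature "threat" dataset below is invented purely for illustration.

```python
# Toy model-serving sketch: scikit-learn classifier behind a FastAPI endpoint.
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.ensemble import RandomForestClassifier

# Invented training data: [requests_per_min, failed_logins] -> malicious flag
X = np.array([[5, 0], [7, 1], [300, 20], [250, 15], [4, 0], [400, 30]])
y = np.array([0, 0, 1, 1, 0, 1])
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

app = FastAPI()


class Event(BaseModel):
    requests_per_min: float
    failed_logins: float


@app.post("/score")
def score(event: Event) -> dict:
    proba = model.predict_proba(
        [[event.requests_per_min, event.failed_logins]]
    )[0, 1]
    return {"malicious_probability": round(float(proba), 3)}

# Run with: uvicorn this_module:app --reload
```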

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The Software Engineer position is part of the Delivery Excellence team in the Strategic Operations Department. Your primary responsibility will be to support GSC with Power BI development, reporting using BI, data analysis, and MIS across various systems. Your key duties and responsibilities will include creating and managing Microsoft Power BI dashboards, utilizing Microsoft DAX, and managing JIRA boards. You will also be expected to write SQL queries for data extraction and analysis, with strong proficiency in writing complex SQL queries. Additionally, you will design, develop, deploy, and maintain BI interfaces and containers, including data visualizations, dashboards, and reports using Power BI. Other responsibilities will include organizing backlogs in the JIRA ticketing tool, monitoring and troubleshooting existing ETL jobs, BI models, dashboards, and reports, as well as troubleshooting and fixing failures/errors in data or dashboards. You should have exposure to Azure Delta Lake and other cloud offerings, along with a solid understanding of relational databases and strong SQL skills. You will be tasked with assembling, analyzing, and evaluating data to make appropriate recommendations and decisions to support business and project teams. Managing and overseeing 5-6 BI reporting projects simultaneously to ensure that the KPIs aligned with each project are being reported will also be part of your responsibilities. Collaborating with cross-functional teams to populate data to BI boards and containers periodically is essential to this role. Ensuring data accuracy and integrity through regular reviews, data validation, troubleshooting, and documentation is crucial. You will also be expected to enhance efficiency through Lean methodologies, automation, and digital integration to improve processes. Staying up to date with industry trends and advancements in reporting and analytics tools and techniques is necessary, along with having fundamental knowledge of JIRA and ServiceNow. Qualifications & Skills: - Bachelor's degree in a relevant field; 3-4 years of experience in BI development, BI reporting, business analysis, DAX, and other MIS - Intermediate to advanced skills in the MS suite of products - Ability to work on multiple tasks and self-manage deliverables, meetings, and information gathering - Excellent communication skills to present options and solutions in a way easily understood by business users.
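
As one example of the SQL-for-BI work this role involves, the sketch below runs a windowed aggregate that could feed a dashboard tile; the ODBC DSN and table name are hypothetical, not details from the posting.

```python
# Hypothetical KPI-trend extraction for a BI dashboard via ODBC.
import pandas as pd
import pyodbc

QUERY = """
SELECT project_id,
       sprint_start,
       SUM(story_points) AS points_done,
       -- running total per project: a typical burn-up trend line
       SUM(SUM(story_points)) OVER (
           PARTITION BY project_id ORDER BY sprint_start
       ) AS points_cumulative
FROM jira_story_facts
GROUP BY project_id, sprint_start
ORDER BY project_id, sprint_start;
"""

conn = pyodbc.connect("DSN=reporting_dw")       # hypothetical DSN
kpi_trend = pd.read_sql(QUERY, conn)
kpi_trend.to_csv("kpi_trend.csv", index=False)  # hand-off to the BI layer
```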

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

You will be working full-time from the office in Mumbai, Chennai, or Ahmedabad. As a PL/SQL DB Developer with 6 to 8 years of relevant experience, you will be expected to have a solid understanding of database concepts, stored procedures, functions/triggers, Unix, and ETL tools such as DataStage, Informatica, or SSIS. Your responsibilities will include hands-on work in PL/SQL and Unix, along with strong communication skills and the ability to work well in a team. The key skills required for this role are proficiency in PL/SQL, ETL, and Unix. The hiring process will involve screening by HR, followed by two technical rounds and a final HR round.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Where Data Does More. Join the Snowflake team. Snowflake’s Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues, drawing on technical experience across a variety of operating systems, database technologies, big data, data integration, connectors, and networking. Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers’ success. We form partnerships with customers by listening, learning, and building connections. Snowflake’s values are key to our approach and success in delivering world-class Support. Putting customers first, acting with integrity, owning initiative and accountability, and getting it done are Snowflake's core values, which are reflected in everything we do. As a Senior Cloud Support Engineer, your role is to delight our customers with your passion and knowledge of Snowflake Data Warehouse. Customers will look to you for technical guidance and expert advice with regard to their effective and optimal use of Snowflake. You will be the voice of the customer regarding product feedback and improvements for Snowflake’s product and engineering teams. You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements. Based on business needs, you may be assigned to work with one or more Snowflake Priority Support customers. You will develop a strong understanding of the customer’s use case and how they leverage the Snowflake platform. You will deliver exceptional service, enabling them to achieve the highest levels of continuity and performance from their Snowflake implementation. Ideally, you have worked in a 24x7 environment, handled technical case escalations and incident management, worked in technical support for an RDBMS, been on-call during weekends, and are familiar with database release management. AS A SENIOR CLOUD SUPPORT ENGINEER AT SNOWFLAKE, YOU WILL: Drive technical solutions to complex problems, providing in-depth analysis and guidance to Snowflake customers and partners using the following methods of communication: email, web, and phone. Adhere to response and resolution SLAs and escalation processes to ensure fast resolution of customer issues that exceeds expectations. Demonstrate good problem-solving skills and be process-oriented. Utilize the Snowflake environment, connectors, third-party partner software, and tools to investigate issues. Document known solutions to the internal and external knowledge base. Report well-documented bugs and feature requests arising from customer-submitted requests. Partner with engineering teams in prioritizing and resolving customer requests. Participate in a variety of Support initiatives. Provide support coverage during holidays and weekends based on business needs. OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE: Bachelor’s or Master’s degree in Computer Science or an equivalent discipline.
5+ years of experience in a Technical Support environment or a similar technical function in a customer-facing role. Solid knowledge of at least one major RDBMS. In-depth understanding of SQL data types, aggregations, and advanced functions, including analytical/window functions. A deep understanding of resource locks and experience with managing concurrent transactions. Proven experience with query lifecycle, profiles, and execution/explain plans. Demonstrated ability to analyze and tune query performance and provide detailed recommendations for performance improvement. Advanced skills in interpreting SQL queries and execution workflow logic. Proven ability with rewriting joins for optimization while maintaining logical consistency. In-depth knowledge of various caching mechanisms and the ability to take advantage of caching strategies to enhance performance. Ability to interpret systems performance metrics (CPU, I/O, RAM, network stats). Proficiency with JSON, XML, and other semi-structured data formats. Proficient in database patch and release management. NICE TO HAVES: Knowledge of distributed computing principles and frameworks (e.g., Hadoop, Spark). Scripting/coding experience in any programming language. Database migration and ETL experience. Ability to monitor and optimize cloud spending using cost management tools and strategies. SPECIAL REQUIREMENTS: Participate in pager duty rotations during nights, weekends, and holidays. Ability to work the 4th/night shift, which typically starts from 10 pm IST. Applicants should be flexible with schedule changes to meet business needs. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
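
To illustrate the query-analysis skills listed above, here is a small sketch using the Snowflake Python connector: inspect an execution plan, then apply a window-function pattern for "latest row per key". The account, credentials, and table names are placeholders.

```python
# Hypothetical diagnostic session with snowflake-connector-python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="support_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# A common optimization target: "latest row per key" via a window function
# instead of a shuffle-heavy self-join.
query = """
SELECT *
FROM orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) = 1
"""

# Inspect the plan before advising on performance.
for row in cur.execute("EXPLAIN USING TEXT " + query):
    print(row[0])

cur.execute(query)
latest_orders = cur.fetchall()
```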

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Solution Architect at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As a part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that will involve detailed analytical skills and analysis. This will be done in conjunction with fellow engineers, business analysts, and business stakeholders. To be successful as a Solution Architect you should have experience with: A very good, broad understanding of a wide variety of technologies pertinent to Barclaycard, including emerging technologies (e.g. AWS/Azure, Java, adaptive and responsive design, etc.). Awareness of IT security patterns, considerations, and best practices. Experience designing secure, scalable, highly available, resilient, performant solutions. Knowledge of software delivery and deployment patterns (e.g. continuous delivery, continuous integration, etc.) with a deep understanding of enterprise container platforms (e.g. Docker). Knowledge of different integration mechanisms (e.g. RESTful web services, ETL, etc.). Awareness of different data solutions and data architecture best practices (e.g. Mongo, data-driven design, etc.). Awareness of SCM, packaging, and build tools such as Git, Jenkins, Maven, and Gradle.

Some Other Highly Valued Skills Include: Payments/acquiring domain knowledge and experience. Good understanding of customer journeys in acquiring (authorisations, scheme clearing, scheme settlement, merchant payments, chargeback processing). Familiarity with integration and implementation issues and their architectural implications. Excellent understanding of best-practice architectural and design methods with proven innovative and leading-edge thinking (e.g. domain-driven architecture, event-based architecture, building for resilience, scalability, performance, microservice design patterns, etc.). Project delivery – understands different project methodologies, project lifecycles, major phases, dependencies, and milestones within a project, and the required documentation needs. Service delivery – good understanding of concepts of service delivery and support and how these can be affected by technical delivery. Appreciation of different infrastructure patterns (e.g. internet-facing environments, operational data stores, DMZ, etc.).

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and designing and implementing solutions that meet those needs while balancing technology risks against business delivery, driving consistency.

Accountabilities: Design and development of solutions as products that can evolve, meeting business requirements that align with modern software engineering practices and automated delivery tooling; this includes identification and implementation of the technologies and platforms. Targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless, containerisation, etc. Best-practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank's resiliency expectations. Solutions that appropriately balance risks and controls to deliver the agreed business and technology value. Adoption of standardised solutions where they fit; if no standard solutions fit, feed into their ongoing evolution where appropriate. Fault-finding and performance-issue support to operational support teams, leveraging available tooling. Solution design impact assessment in terms of risk, capacity, and cost impact, including estimation of project change and ongoing run costs. Development of the requisite architecture inputs required to comply with the bank's governance processes, including design artefacts required for architecture, privacy, security, and records management governance processes.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives, and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America's modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform. Additional skills: Terraform, Tekton, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes.

Experience Required: GCP Data Engineer Certified. Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions. 5+ years of complex SQL development experience. 2+ years of experience with programming languages such as Python, Java, or Apache Beam. Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions.

Experience Preferred: In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage, and DevOps tools such as Tekton, GitHub, Terraform, and Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing with microservice architecture from a container orchestration framework. Experience in designing pipelines and architectures for data processing. Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques. Self-directed, works independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Data engineering or development experience gained in a regulated financial environment. Experience in coaching and mentoring data engineers. Project management tools like Atlassian JIRA. Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud. Experience with AI solutions or platforms that support AI solutions. Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years. Education Required: Bachelor's Degree. TekWissen® Group is an equal opportunity employer supporting workforce diversity.
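
A minimal Cloud Composer/Airflow DAG sketch in the spirit of the pipelines described above, with hypothetical project, dataset, and table names: a daily task that rebuilds a curated BigQuery mart from raw events.

```python
# Hypothetical daily mart build on GCP (Airflow 2.x style).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def build_daily_mart() -> None:
    """Aggregate raw receivable events into a reporting mart."""
    client = bigquery.Client(project="example-project")
    client.query(
        """
        CREATE OR REPLACE TABLE marts.daily_receivables AS
        SELECT account_id, DATE(event_ts) AS day, SUM(amount) AS balance_delta
        FROM raw.receivable_events
        GROUP BY account_id, day
        """
    ).result()  # block until the BigQuery job finishes


with DAG(
    dag_id="receivables_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="build_daily_mart", python_callable=build_daily_mart)
```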

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position at our organization involves participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your primary goal in this role will be to contribute to application systems analysis and programming activities. You will need hands-on experience in ETL and Big Data testing, delivering high-quality solutions; proficiency in database and UI testing using automation tools; and knowledge of performance, volume, and stress testing. A strong understanding of SDLC/STLC processes, different types of manual testing, and Agile methodology will be essential. You will be skilled in designing and executing test cases, authoring user stories, defect tracking, and aligning with business requirements. Being open to learning and implementing new innovations in automation processes according to project needs will be crucial. Your role will also involve managing complex tasks and teams, fostering a collaborative, growth-oriented environment through strong technical and analytical skills. You will utilize your knowledge of applications development procedures, concepts, and other technical areas to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code. Familiarity with the test management tool JIRA and automation tools like Python, PySpark, Java, Spark, MySQL, Selenium, and Tosca is required, with experience in Hadoop/Ab Initio being a plus. Consulting with users, clients, and other technology groups on issues, recommending programming solutions, and installing and supporting customer exposure systems will also be part of your responsibilities. Qualifications: - 4-8 years of relevant experience in the financial services industry - Intermediate-level experience in an applications development role - Clear and concise written and verbal communication skills - Demonstrated problem-solving and decision-making abilities - Ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements Education: - Bachelor's degree/University degree or equivalent experience Please note that this job description provides a high-level overview of the work performed, and additional job-related duties may be assigned as required.

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Sr Data Tester at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As a part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that will involve detailed analytical skills and analysis. This will be done in conjunction with fellow engineers, business analysts, and business stakeholders. To be successful as a Sr Data Tester you should have experience with: Leading end-to-end testing of complex ETL workflows, with a strong focus on Ab Initio. Validating data transformations, integrations, and migrations across data warehousing environments. Designing and executing test cases, test plans, and test strategies for Ab Initio and AWS-based data solutions, ensuring compliance with cloud best practices. Writing and optimizing complex SQL queries for data validation and reconciliation. Performing root cause analysis and troubleshooting issues across Unix-based systems and cloud platforms. Collaborating with developers, analysts, and business stakeholders to ensure test coverage and traceability.

Some Other Highly Valued Skills Include: Graduate degree. Excellent communication and analytical skills. Skilled communicator at a wide variety of levels and capabilities. Collaborative and able to share best practice at all levels.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improving testing processes and methodologies, to ensure software quality and reliability.

Accountabilities: Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
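
For illustration, here is a self-contained Python sketch of the reconciliation pattern such testing leans on: an EXCEPT set-difference between source and target surfaces dropped or mis-transformed rows. SQLite stands in for the warehouse, and the tables and seeded mismatch are hypothetical.

```python
# Hypothetical source-to-target reconciliation using a set-difference query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_customers (id INTEGER, name TEXT, balance REAL);
    CREATE TABLE target_customers (id INTEGER, name TEXT, balance REAL);
    INSERT INTO source_customers VALUES (1, 'A', 10.0), (2, 'B', 20.0);
    INSERT INTO target_customers VALUES (1, 'A', 10.0), (2, 'B', 99.0);
""")

# Rows present in the source but missing or altered in the target: a non-empty
# result means the ETL load dropped or transformed data incorrectly.
mismatches = conn.execute("""
    SELECT id, name, balance FROM source_customers
    EXCEPT
    SELECT id, name, balance FROM target_customers
""").fetchall()

if mismatches:
    print("ETL validation failed; unmatched source rows:", mismatches)
```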

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Test Automation Engineer at Barclays where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable and secure infrastructure, ensuring seamless delivery of our digital solutions. To be successful as a Test Automation Engineer you should have experience with: Hands-on experience in one or more technical skills under any of the technology platforms as below: Mainframe - COBOL, IMS, CICS, DB2, VSAM, JCL, TWS, File-Aid, REXX Open Systems and tools – Selenium, Java, Jenkins, J2EE, Web-services, APIs, XML, JSON, Parasoft/SoaTest –Service Virtualization API Testing Tools – SOAP UI , Postman, Insomnia. Mid-Tier technology – MQ, WebSphere, UNIX, API 3rd party hosting platforms Data warehouse – ETL, Informatica, Ab-initio, Oracle, Hadoop Good knowledge of API Architecture and API Concepts. Preferably have domain Knowledge in Retail Banking and testing experience in one or more core banking product platforms/systems such as Accounting and clearing / General Ledger, Savings & Insurance products, Online/mobile payments, Customer & Risk systems, Mortgages and payments. Experience in JIRA and similar test management tools. Test Automation Skills Hand on Experience of Test Automation using Java or any other Object Oriented Programming Language Hands on Experience of Automation Framework Creation and Optimization. Good understanding of Selenium, Appium, SeeTest, JQuery , Java Script and Cucumber. Working experience of Build tools like apache ant, maven, gradle. Knowledge/previous experience of dev ops and Continuous Integration using Jenkins, GIT, Dockers. Experience In API Automation Framework Like RestAssured , Karate. Experience in GitLab and or Gitlab Duo will be an added advantage. Some Other Highly Valued Skills May Include E2E Integration Testing and team leading Experience. Previous Barclays Experience. Understanding of Mainframes and Barclays Systems will be an added Advantage. Hands on Experience in Agile methodology. Domain/Testing/Technical certification will be an advantage You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability. Accountabilities Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. . Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. 
Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Stay informed of industry technology trends and innovations, and actively contribute to the organization's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, including appraisal of performance relative to objectives and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes.

Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
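For readers unfamiliar with what the API test automation skills listed above look like in practice, here is a minimal sketch using Python's pytest and requests (the role itself leans on Java tools such as RestAssured and Karate); the base URL, endpoint, and response fields are invented placeholders, not Barclays systems:

```python
# Minimal API test sketch using pytest + requests.
# The base URL, endpoint, and payload fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # assumption: a stand-in service


def test_get_account_returns_expected_schema():
    resp = requests.get(f"{BASE_URL}/accounts/12345", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Contract checks: presence and type of key fields in the response.
    assert {"accountId", "status"} <= body.keys()
    assert isinstance(body["accountId"], str)


def test_create_account_is_idempotent():
    payload = {"customerId": "C-001", "type": "SAVINGS"}
    first = requests.post(f"{BASE_URL}/accounts", json=payload, timeout=10)
    second = requests.post(f"{BASE_URL}/accounts", json=payload, timeout=10)
    # Idempotency check: repeated creation should not duplicate the resource.
    assert first.status_code in (200, 201)
    assert second.status_code in (200, 409)
```

Run with `pytest`; in a CI setup like the Jenkins/GitLab pipelines mentioned above, such tests would execute on every build.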

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

Fulcrum Digital is an agile, next-generation digital accelerating company providing digital transformation and technology services across industries such as banking & financial services, insurance, retail, higher education, food, healthcare, and manufacturing.

As a Data Quality Assurance Engineer, your main objectives will include hands-on EDW source-to-target testing, data transformation/manipulation testing, data quality/completeness validation, ETL processes, and running processes through schedulers. You will be responsible for developing and executing comprehensive test plans to validate data within on-prem and cloud data warehouses, conducting thorough testing of ETL processes, dimensional data models, and reporting outputs, identifying and tracking data quality issues, and ensuring data consistency and integrity across different data sources and systems.

Collaboration with the Data Quality and Data Engineering teams is essential to define quality benchmarks and metrics, improve QA testing strategies, and implement best practices for data validation and error handling. You will work closely with various stakeholders to understand data requirements and deliverables, design and support testing infrastructure, provide detailed reports on data quality findings, and contribute insights to enhance data quality and processing efficiency.

To be successful in this role, you should have a Bachelor's or Master's degree in computer science or equivalent, 2 to 3 years of experience in data warehouse development/testing, a strong understanding of data warehouse and data quality fundamentals, and experience in SQL Server, SSIS, SSAS, and SSRS testing. You should also bring great attention to detail, a result-driven test approach, excellent written and verbal communication skills, and willingness to take on challenges and provide off-hour support as needed.

If you have a minimum of 2 to 3 years of quality assurance experience with a proven track record of improving data quality, experience with SSIS, MSSQL, Snowflake, and dbt, knowledge of QA automation tools and ETL processes, and familiarity with cloud computing and the data ecosystem on Snowflake, you would be a great fit for this role. Desirable qualifications include knowledge of insurance data and its processes, data validation experience between on-prem and cloud architecture, and familiarity with hybrid data ecosystems.
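Source-to-target testing of the kind described above typically reduces to reconciling row counts and column aggregates between the source system and the warehouse. A minimal sketch, assuming hypothetical ODBC DSNs and table names and pyodbc connectivity:

```python
# Source-to-target reconciliation sketch: compare row counts and a column
# checksum between a source table and its warehouse target.
# DSNs and table names are hypothetical placeholders.
import pyodbc

SRC_DSN = "DSN=source_db"  # assumption: source system connection
TGT_DSN = "DSN=edw"        # assumption: warehouse connection


def fetch_one(dsn: str, sql: str):
    # Open a connection, run a single aggregate query, return the row.
    with pyodbc.connect(dsn) as conn:
        return conn.cursor().execute(sql).fetchone()


src_count, src_sum = fetch_one(SRC_DSN, "SELECT COUNT(*), SUM(amount) FROM sales")
tgt_count, tgt_sum = fetch_one(TGT_DSN, "SELECT COUNT(*), SUM(amount) FROM dw.fact_sales")

assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"
assert src_sum == tgt_sum, f"Amount checksum mismatch: {src_sum} vs {tgt_sum}"
print("Source and target are reconciled.")
```

In practice such checks run per table and per load batch, with mismatches logged to a defect tracker rather than raised as bare assertions.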

Posted 1 day ago

Apply

4.0 - 10.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

At EY, you will have the opportunity to shape a career as unique as you are, supported by a global network, inclusive culture, and cutting-edge technology to help you reach your full potential. Your individual perspective and voice are valued to contribute to the continuous improvement of EY. By joining us, you can create an outstanding experience for yourself while contributing to a more efficient and inclusive working world for all.

As a Data Engineering Lead, you will work closely with the Data Architect to design and implement scalable data lake architecture and data pipelines. Your responsibilities will include designing and implementing scalable data lake architectures using Azure Data Lake services, developing and maintaining data pipelines for data ingestion from various sources, optimizing data storage and retrieval processes for efficiency and performance, ensuring data security and compliance with industry standards, collaborating with data scientists and analysts to enhance data accessibility, monitoring and troubleshooting data pipeline issues to ensure reliability, and documenting data lake designs, processes, and best practices. You should have experience with SQL and NoSQL databases, as well as familiarity with big data file formats such as Parquet and Avro.

**Roles and Responsibilities:**

**Must Have Skills:**
- Azure Data Lake
- Azure Synapse Analytics
- Azure Data Factory
- Azure Databricks
- Python (PySpark, NumPy, etc.)
- SQL
- ETL
- Data warehousing
- Azure DevOps
- Experience in developing streaming pipelines using Azure Event Hub, Azure Stream Analytics, Spark Streaming
- Experience in integrating with business intelligence tools such as Power BI

**Good To Have Skills:**
- Big Data technologies (e.g., Hadoop, Spark)
- Data security

**General Skills:**
- Experience with Agile and DevOps methodologies and the software development lifecycle
- Proactive and accountable for deliverables
- Ability to identify and escalate dependencies and risks
- Proficient in working with DevOps tools with limited supervision
- Timely completion of assigned tasks and regular status reporting
- Capability to train new team members
- Desired knowledge of cloud solutions like Azure or AWS with DevOps/Cloud certifications
- Ability to work effectively with multicultural global teams and virtually
- Strong relationship-building skills with project stakeholders

Join EY in its mission to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Leveraging data and technology, diverse EY teams across 150+ countries provide assurance and support clients in growth, transformation, and operations across various sectors. Through its services in assurance, consulting, law, strategy, tax, and transactions, EY teams strive to address complex global challenges by asking insightful questions to discover innovative solutions.
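As an illustration of the pipeline work this role describes, here is a minimal PySpark sketch that ingests a raw CSV landed in a data lake and writes curated, partitioned Parquet; the storage paths and column names are assumptions, not EY's actual environment:

```python
# PySpark ingestion sketch: read raw CSV from a data lake landing zone,
# apply light cleansing, and write partitioned Parquet to a curated zone.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

# Assumed ADLS Gen2 landing path for raw order files.
raw = (spark.read.option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Type the columns and drop duplicate records on the business key.
curated = (raw
           .withColumn("order_date", F.to_date("order_date"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"]))

# Write Parquet partitioned by date for efficient downstream reads.
(curated.write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

Partitioning by date is a common choice here because most downstream queries and reprocessing jobs filter on load or business date.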

Posted 1 day ago

Apply

3.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have hands-on experience with Celonis EMS (Execution Management System) and strong SQL skills for data extraction, transformation, and modeling. Proficiency in PQL (Process Query Language) for custom process analytics is essential, along with experience in integrating Celonis with SAP, Oracle, Salesforce, or other ERP/CRM systems. Knowledge of ETL, data pipelines, and APIs (REST/SOAP) is crucial for this role.

You should also demonstrate strong process mining and analytical skills, including an understanding of business process modeling and process optimization techniques, and at least one OCPM project experience. Your responsibilities will include analyzing event logs to identify bottlenecks, inefficiencies, and automation opportunities.

With 6-10 years of experience in the IT industry focusing on Data Architecture/Business Process, including 3-4 years in process mining, data analytics, or business intelligence, you should be well equipped for this position. A Celonis certification (e.g., Celonis Data Engineer, Business Analyst, or Solution Consultant) would be a plus, and any additional OCPM experience is also welcomed. Candidates who can join within 30-45 days will be given priority consideration for this role.
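Bottleneck analysis over event logs, as described above, is the heart of process mining. Celonis expresses it in PQL, but the underlying idea can be sketched in plain pandas; the case/activity/timestamp columns below follow the usual event-log convention, and the data is invented:

```python
# Process-mining-style bottleneck sketch: compute the mean waiting time
# between consecutive activities per case from a generic event log.
# Column names follow the conventional case/activity/timestamp triple.
import pandas as pd

log = pd.DataFrame({
    "case_id":  ["A", "A", "A", "B", "B"],
    "activity": ["Create PO", "Approve PO", "Pay Invoice",
                 "Create PO", "Approve PO"],
    "timestamp": pd.to_datetime([
        "2024-01-01", "2024-01-05", "2024-01-20",
        "2024-01-02", "2024-01-03"]),
})

# Order events within each case, then pair each event with its successor.
log = log.sort_values(["case_id", "timestamp"])
log["next_activity"] = log.groupby("case_id")["activity"].shift(-1)
log["wait"] = log.groupby("case_id")["timestamp"].shift(-1) - log["timestamp"]

# Mean transition time per activity pair; the slowest edges are bottlenecks.
bottlenecks = (log.dropna(subset=["next_activity"])
               .groupby(["activity", "next_activity"])["wait"]
               .mean()
               .sort_values(ascending=False))
print(bottlenecks)
```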

Posted 1 day ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are Allvue Systems, the leading provider of software solutions for the Private Capital and Credit markets. Whether a client wants an end-to-end technology suite or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We're looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals.

Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you're collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership and exceptional results. Come be a part of the team that's revolutionizing the alternative investment industry. Define your own future with Allvue Systems!

Responsibilities:
- Design, implement, and maintain data pipelines that handle both batch and real-time data ingestion.
- Integrate various data sources (databases, APIs, third-party data) into Snowflake and other data systems.
- Work closely with data scientists and analysts to ensure data availability, quality, and performance.
- Troubleshoot and resolve issues related to data pipeline performance, scalability, and integrity.
- Optimize data processes for speed, scalability, and cost efficiency.
- Ensure data governance and security best practices are implemented.

Requirements:
- 5 to 8 years of total experience, including at least 4 years in data engineering or related roles.
- Strong experience with Snowflake, Kafka, and Debezium.
- Proficiency in SQL, Python, and ETL frameworks.
- Experience with data warehousing, data modeling, and pipeline optimization.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services or fintech industry is highly desirable.
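A real-time ingestion path of the kind described (Debezium change events flowing through Kafka into Snowflake) might be sketched as follows; the topic, credentials, and target table are placeholders, and a production pipeline would batch writes and handle errors far more carefully:

```python
# Streaming ingestion sketch: consume CDC events from Kafka and insert them
# into Snowflake. Topic, credentials, and table are hypothetical placeholders.
import json

from kafka import KafkaConsumer  # kafka-python
import snowflake.connector

consumer = KafkaConsumer(
    "dbserver1.public.trades",            # assumed Debezium topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",  # assumptions
    warehouse="LOAD_WH", database="RAW", schema="CDC",
)

with conn.cursor() as cur:
    # Loops forever, one insert per event; real pipelines batch and commit
    # periodically, and dead-letter malformed messages instead of skipping.
    for msg in consumer:
        after = (msg.value.get("payload") or {}).get("after")  # Debezium envelope
        if after:
            cur.execute(
                "INSERT INTO trades_raw (id, symbol, qty) VALUES (%s, %s, %s)",
                (after["id"], after["symbol"], after["qty"]),
            )
```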

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role of Sr Specialist Visualization & Automation in Hyderabad, India involves defining and driving the platform engineering of business intelligence solutions with a focus on Power BI technology. As part of your responsibilities, you will use your strong Power BI skills to oversee the creation and management of BI and analytics solutions. You will be instrumental in driving the success of technology usage for solution delivery, best practices, standards definition, compliance, smooth transition to operations, improvements, and enablement of the business.

Collaboration with the solution delivery lead and visualization lead on existing, new, and upcoming features, technology decisioning, and roadmap will be crucial. You will work closely with the solution architect and platform architect to define the visualization architecture pattern based on functional and non-functional requirements, considering available technical patterns. Additionally, you will define and drive the DevOps roadmap to enable Agile ways of working, a CI/CD pipeline, and automation for self-serve governance of the Power BI platform in collaboration with the platform lead. You will be accountable for ensuring adherence to security and compliance policies and procedures, including Information Security & Compliance (ISC), Legal, ethics, and other compliance policies and procedures, in defining architecture standards, patterns, and platform solutions.

The role requires 8-10 years of IT experience in data and analytics and visualization, with strong exposure to Power BI solution delivery and platform automation in a global matrix organization. An in-depth understanding of database management systems, ETL, OLAP, and data lake technologies, along with hands-on Power BI experience, is essential; knowledge of other visualization technologies is a plus. A specialization in the pharma domain and an understanding of data usage across the end-to-end enterprise value chain are advantageous. Good interpersonal, written and verbal communication skills, time management, and technical expertise aligned with Novartis Values & Behaviors are necessary.

Join Novartis, a company committed to building an outstanding, inclusive work environment with diverse teams representative of the patients and communities served. Be a part of a mission to reimagine medicine and improve lives. If this role does not align with your career goals but you wish to stay connected with Novartis for future opportunities, join the Novartis Network. Explore the benefits, rewards, and the opportunity to create a brighter future together with Novartis.
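Self-serve governance automation of the kind described often builds on the Power BI REST API. A minimal sketch that lists workspaces, assuming an Azure AD access token has already been acquired (token acquisition via MSAL is omitted):

```python
# Power BI governance sketch: list workspaces via the Power BI REST API.
# Assumes an already-acquired Azure AD bearer token; error handling is minimal.
import requests

ACCESS_TOKEN = "<azure-ad-token>"  # assumption: obtained via MSAL or similar

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/groups",  # workspaces endpoint
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each workspace entry carries an id and display name, useful as the starting
# point for governance checks (ownership, naming standards, dataset refresh).
for ws in resp.json().get("value", []):
    print(ws["id"], ws["name"])
```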

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a member of the infrastructure team at FIS, you will play a crucial role in troubleshooting and resolving technical issues related to Azure and SQL Server. Your responsibilities will include developing data solutions, understanding business requirements, and transforming data from different sources. You will design and implement ETL processes and collaborate with cross-functional teams to ensure that solutions meet business needs.

To excel in this role, you should have a degree in Computer Science, a minimum of 4 years of experience, and proficient working knowledge of Azure, SQL, and ETL. Skills in any programming language will be an asset, along with working knowledge of data warehousing, experience with JSON and XML data structures, and familiarity with working with APIs.

At FIS, we offer a flexible and creative work environment where you can learn, grow, and make a real impact on your career. You will be part of a diverse and collaborative atmosphere, with access to professional and personal development resources, opportunities to volunteer and support charities, and competitive salary and benefits. Please note that current and future sponsorship are not available for this position.

FIS is committed to protecting the privacy and security of all personal information processed to provide services to our clients. For specific information on how FIS safeguards personal information online, please refer to the Online Privacy Notice. Recruitment at FIS primarily operates on a direct sourcing model, with a small portion of hires made through recruitment agencies. FIS does not accept resumes from agencies not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings.
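The JSON/API-to-SQL-Server work this role mentions often starts with a small extract-and-load step like the sketch below; the API URL, ODBC DSN, and staging table are hypothetical:

```python
# ETL sketch: pull JSON records from an API and bulk-load them into a
# SQL Server staging table. URL, DSN, and table are hypothetical placeholders.
import requests
import pyodbc

# Extract: fetch a JSON array of records from an assumed REST endpoint.
records = requests.get("https://api.example.com/positions", timeout=30).json()

# Load: insert into a staging table using pyodbc's bulk-insert fast path.
with pyodbc.connect("DSN=staging_db") as conn:  # assumed ODBC DSN
    cur = conn.cursor()
    cur.fast_executemany = True  # batches parameter sets for speed
    cur.executemany(
        "INSERT INTO stg.positions (position_id, symbol, quantity) VALUES (?, ?, ?)",
        [(r["id"], r["symbol"], r["qty"]) for r in records],
    )
    conn.commit()
```

Transformations would then run in SQL against the staging table, keeping the load step simple and restartable.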

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Senior Technical Lead at Barclays, where you'll have the opportunity to contribute to the evolution of our digital landscape, driving innovation and excellence. In this role, you will leverage cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Your primary responsibility will be to deliver technology stack solutions, utilizing your strong analytical and problem-solving skills to understand business requirements and deliver high-quality solutions. Working collaboratively with fellow engineers, business analysts, and stakeholders, you will tackle complex technical issues that require detailed analytical skills and analysis.

To excel as a Senior Technical Lead, you should possess experience in leading a team to perform complex tasks, using your professional knowledge and skills to deliver impactful work that influences the entire business function. You will be responsible for setting objectives, coaching employees to achieve those objectives, conducting performance appraisals, and determining reward outcomes. Whether you have leadership responsibilities or work as an individual contributor, you will lead collaborative assignments, guide team members, and identify new directions for projects to meet desired outcomes.

Your role may involve consulting on complex issues, providing advice to leaders, identifying ways to mitigate risks, and developing new policies and procedures to enhance control and governance. You will take ownership of managing risks, strengthening controls, advising on decision-making, contributing to policy development, and ensuring operational effectiveness. Collaboration with other functions and business divisions will be essential to stay aligned with business strategies and activities.

In addition to the above, some highly valued technical skills for this role include proficiency in Ab Initio and AWS, a strong ETL and data integration background, experience in building complex ETL data pipelines, knowledge of data warehousing principles, Unix, SQL, basic AWS/cloud architecture, data modeling, ETL scheduling, and strong data analysis skills. As a Senior Technical Lead, you will be assessed on key critical skills such as risk management, change and transformation, business acumen, strategic thinking, and digital and technology expertise, along with job-specific technical skills. This role is based in Pune.

**Purpose of the Role:** The purpose of this role is to design, develop, and enhance software using various engineering methodologies to provide business, platform, and technology capabilities for our customers and colleagues.

**Key Accountabilities:**
- Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring scalability, maintainability, and performance optimization of the code.
- Collaborate with product managers, designers, and engineers to define software requirements, devise solution strategies, and integrate software solutions with business objectives.
- Participate in code reviews, promote a culture of code quality and knowledge sharing, and stay informed about industry technology trends to contribute to technical excellence.
- Adhere to secure coding practices, implement effective unit testing practices, and ensure proper code design, readability, and reliability.
**Assistant Vice President Expectations:** As an Assistant Vice President, you are expected to advise on decision-making, contribute to policy development, and ensure operational effectiveness. Collaboration with other functions and business divisions is crucial. Whether leading a team or working as an individual contributor, you will be accountable for delivering impactful work, coaching employees, and promoting a culture of excellence.

**Barclays Values and Mindset:** All colleagues at Barclays are expected to embody the values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behavior.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions.

The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. This position requires good collaboration skills to provide guidance on analytics aspects to the team in various analytics-related activities. The ideal candidate is experienced in Qlik Sense architecture design and proficient in load script implementation and best practices, with hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting techniques. They are skilled in data integration through ETL processes from various sources, adept at data transformation including the creation of QVD files and set analysis, and capable of data modeling using dimensional modeling, star schema, and snowflake schema.

The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies.

Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or equivalent work experience, along with effective verbal and written communication skills. A minimum of 3-5 years of experience in implementing end-to-end business intelligence using Qlik Sense is required, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. Understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial; technical expertise in Qlik Sense, SQL Server, and data modeling, along with experience with clinical trial data and SDTM standards, is beneficial.

This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
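Validating a Qlik Sense dashboard figure against the source database, as this role requires, usually means re-computing the KPI in SQL and comparing the two numbers. A minimal sketch with a hypothetical connection, table, and dashboard value:

```python
# Dashboard validation sketch: recompute a KPI directly from SQL Server and
# compare it against the figure shown on the Qlik Sense sheet.
# The DSN, table, and the expected value are hypothetical placeholders.
import pyodbc

DASHBOARD_TOTAL = 12543.0  # value read off the Qlik sheet (assumption)

with pyodbc.connect("DSN=clinical_dw") as conn:  # assumed ODBC DSN
    (db_total,) = conn.cursor().execute(
        "SELECT SUM(enrolled_subjects) FROM dbo.site_enrollment"
    ).fetchone()

# Allow a small tolerance for rounding applied in the dashboard layer.
tolerance = 0.01
assert abs(float(db_total) - DASHBOARD_TOTAL) <= tolerance, (
    f"Dashboard/SQL mismatch: {db_total} vs {DASHBOARD_TOTAL}")
print("Dashboard KPI matches the database.")
```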

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of financial and non-financial services across the globe. The position is a senior technical, hands-on delivery role requiring knowledge of data engineering, cloud infrastructure, platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate, with 6-8 years of experience, will possess strong technical skills, an eagerness to learn, a keen interest in Financial Crime, Financial Risk, and Compliance technology transformation, the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.

In this role, you will:
- Ingest and provision raw datasets, enriched tables, and curated, re-usable data assets to enable a variety of use cases.
- Drive improvements in the reliability and frequency of data ingestion, including increasing real-time coverage.
- Support and enhance data ingestion infrastructure and pipelines.
- Design and implement data pipelines to collect data from disparate sources across the enterprise and external sources and deliver it to the data platform.
- Implement Extract, Transform and Load (ETL) workflows, ensuring data availability at each stage in the data flow (see the sketch after this posting).
- Identify and onboard data sources, conduct exploratory data analysis, and evaluate modern technologies, frameworks, and tools in the data engineering space.

Core/must-have skills:
- 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (ETL: ODI, SSIS; DB: PL/SQL and AWS Redshift).
- Experience managing data extraction, transformation, and loading from various sources using Oracle Data Integrator and other tools such as SSIS.
- Database design and dimension modeling using Oracle PL/SQL and Microsoft SQL Server.
- Advanced working SQL knowledge and experience with relational and NoSQL databases.
- Strong analytical and critical thinking skills, expertise in data modeling and DB design, and experience building and optimizing data pipelines.

Good to have:
- Experience in the Financial Crime, Financial Risk, and Compliance technology transformation domains.
- Certification on any cloud tech stack, preferably Microsoft Azure.
- In-depth knowledge and hands-on experience with data engineering, data warehousing, and Delta Lake on-prem and cloud platforms.
- Ability to script, code, query, and design systems for maintaining Azure/AWS lakehouses, ETL processes, business intelligence, and data ingestion pipelines.

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
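An ETL workflow that "ensures data availability at each stage", as described above, can be sketched as a sequence of staged steps with row-count checks; the DSN and tables are assumptions, and in this role ODI or SSIS would normally orchestrate the steps:

```python
# Staged ETL sketch: extract into a staging table, transform into the target,
# and verify that each stage produced rows before moving on.
# DSN and table names are hypothetical; ODI/SSIS would orchestrate this in practice.
import pyodbc


def run_stage(cur, name: str, sql: str, check_table: str) -> None:
    # Execute the stage, then confirm data is available downstream of it.
    cur.execute(sql)
    (rows,) = cur.execute(f"SELECT COUNT(*) FROM {check_table}").fetchone()
    if rows == 0:
        raise RuntimeError(f"Stage '{name}' produced no rows in {check_table}")
    print(f"Stage '{name}' ok: {rows} rows available")


with pyodbc.connect("DSN=oracle_dw") as conn:  # assumed ODBC DSN
    cur = conn.cursor()
    run_stage(cur, "extract",
              "INSERT INTO stg_txn SELECT * FROM src_txn",
              "stg_txn")
    run_stage(cur, "transform",
              "INSERT INTO fact_txn (txn_id, amount_usd) "
              "SELECT txn_id, amount * fx_rate FROM stg_txn",
              "fact_txn")
    conn.commit()
```

Failing fast on an empty stage keeps a bad load from silently propagating into downstream reporting tables.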

Posted 1 day ago

Apply