Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Test Automation Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.

To be successful as a Test Automation Engineer, you should have hands-on experience in one or more technical skills under any of the following technology platforms:
- Mainframe: COBOL, IMS, CICS, DB2, VSAM, JCL, TWS, File-AID, REXX
- Open systems and tools: Selenium, Java, Jenkins, J2EE, web services, APIs, XML, JSON, Parasoft SOAtest (service virtualization)
- API testing tools: SOAP UI, Postman, Insomnia
- Mid-tier technology: MQ, WebSphere, UNIX, third-party API hosting platforms
- Data warehouse: ETL, Informatica, Ab Initio, Oracle, Hadoop

Good knowledge of API architecture and API concepts is expected. Preferably, you have domain knowledge in retail banking and testing experience in one or more core banking product platforms/systems, such as accounting and clearing / general ledger, savings and insurance products, online/mobile payments, customer and risk systems, and mortgages and payments. Experience in JIRA and similar test management tools is also expected.

Test automation skills:
- Hands-on experience in test automation using Java or another object-oriented programming language
- Hands-on experience in automation framework creation and optimization
- Good understanding of Selenium, Appium, SeeTest, jQuery, JavaScript, and Cucumber
- Working experience with build tools such as Apache Ant, Maven, and Gradle
- Knowledge of, or previous experience with, DevOps and continuous integration using Jenkins, Git, and Docker
- Experience with API automation frameworks such as REST Assured and Karate
- Experience with GitLab and/or GitLab Duo is an added advantage

Some other highly valued skills may include:
- End-to-end integration testing and team-leading experience
- Previous Barclays experience
- Understanding of mainframes and Barclays systems
- Hands-on experience in Agile methodology
- Domain, testing, or technical certifications

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies to ensure software quality and reliability.

Accountabilities:
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President expectations: To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions.
Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. Alternatively, as an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership of managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information.
'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
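The posting above names REST Assured, Karate, Postman, and similar API testing tools. As a rough illustration of the underlying idea, the kind of check an API test performs can be sketched in a few lines of Python (the payload schema, field names, and values below are invented for the example, not taken from the posting):

```python
# Illustrative API response validation, similar in spirit to the assertions a
# REST Assured or Karate test makes. The schema below is hypothetical.

REQUIRED_FIELDS = {"account_id": str, "balance": float, "currency": str}

def validate_account_payload(payload: dict) -> list[str]:
    """Return a list of validation errors (an empty list means the payload is valid)."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}: {type(payload[field]).__name__}")
    return errors

good = {"account_id": "AC-1001", "balance": 2500.75, "currency": "GBP"}
bad = {"account_id": "AC-1002", "balance": "not-a-number"}

assert validate_account_payload(good) == []
assert validate_account_payload(bad) == [
    "wrong type for balance: str",
    "missing field: currency",
]
```

In a real suite the payload would come from an HTTP call against a test environment; here it is inlined so the check itself stays visible.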
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
Fulcrum Digital is an agile and next-generation digital accelerating company providing digital transformation and technology services across various industries such as banking & financial services, insurance, retail, higher education, food, healthcare, and manufacturing. As a Data Quality Assurance Engineer, your main focus areas will include hands-on EDW source-to-target testing, data transformation/manipulation testing, data quality/completeness validation, ETL processes, and running processes through schedulers. You will be responsible for developing and executing comprehensive test plans to validate data within on-prem and cloud data warehouses, conducting thorough testing of ETL processes, dimensional data models, and reporting outputs, identifying and tracking data quality issues, and ensuring data consistency and integrity across different data sources and systems. Collaboration with the Data Quality and Data Engineering teams is essential to define quality benchmarks and metrics, improve QA testing strategies, and implement best practices for data validation and error handling. You will work closely with various stakeholders to understand data requirements and deliverables, design and support testing infrastructure, provide detailed reports on data quality findings, and contribute insights to enhance data quality and processing efficiency. To be successful in this role, you should have a Bachelor's or Master's degree in computer science or equivalent, 2 to 3 years of experience in data warehouse development/testing, a strong understanding of data warehouse and data quality fundamentals, and experience in SQL Server, SSIS, SSAS, and SSRS testing. Additionally, you should possess great attention to detail, a results-driven test approach, excellent written and verbal communication skills, and a willingness to take on challenges and provide off-hours support as needed.
If you have a minimum of 2 to 3 years of quality assurance experience with a proven track record of improving data quality, experience with SSIS, MSSQL, Snowflake, and dbt, knowledge of QA automation tools and ETL processes, and familiarity with cloud computing and the Snowflake data ecosystem, you would be a great fit for this role. Desirable qualifications include knowledge of insurance data and its processes, data validation experience between on-prem and cloud architectures, and familiarity with hybrid data ecosystems.
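The source-to-target testing this role describes can be sketched in miniature. The following is a simplified reconciliation using in-memory SQLite stand-ins; a real EDW test would run equivalent queries against SQL Server or Snowflake, and the table and column names here are purely illustrative:

```python
import sqlite3

# Minimal source-to-target reconciliation sketch. Table and column names are
# invented for illustration; the same COUNT/SUM comparison pattern applies to
# any source and target pair.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and a simple SUM checksum between source and target."""
    (src_count, src_sum), = conn.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}")
    (tgt_count, tgt_sum), = conn.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}")
    return {"count_match": src_count == tgt_count, "sum_match": src_sum == tgt_sum}

result = reconcile(conn, "src_orders", "tgt_orders")
assert result == {"count_match": True, "sum_match": True}
```

Production completeness checks usually go further (per-column null rates, hash-based row comparison, duplicate detection), but they all build on this count-and-checksum core.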
Posted 3 days ago
4.0 - 10.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
At EY, you will have the opportunity to shape a career as unique as you are, supported by a global network, inclusive culture, and cutting-edge technology to help you reach your full potential. Your individual perspective and voice are valued as contributions to the continuous improvement of EY. By joining us, you can create an outstanding experience for yourself while contributing to a more efficient and inclusive working world for all. As a Data Engineering Lead, you will work closely with the Data Architect to design and implement scalable data lake architecture and data pipelines. Your responsibilities will include designing and implementing scalable data lake architectures using Azure Data Lake services, developing and maintaining data pipelines for data ingestion from various sources, optimizing data storage and retrieval processes for efficiency and performance, ensuring data security and compliance with industry standards, collaborating with data scientists and analysts to enhance data accessibility, monitoring and troubleshooting data pipeline issues to ensure reliability, and documenting data lake designs, processes, and best practices. You should have experience with SQL and NoSQL databases, as well as familiarity with big data file formats such as Parquet and Avro.

**Roles and Responsibilities:**

**Must Have Skills:**
- Azure Data Lake
- Azure Synapse Analytics
- Azure Data Factory
- Azure Databricks
- Python (PySpark, NumPy, etc.)
- SQL
- ETL
- Data warehousing
- Azure DevOps
- Experience in developing streaming pipelines using Azure Event Hub, Azure Stream Analytics, and Spark streaming
- Experience in integrating with business intelligence tools such as Power BI

**Good To Have Skills:**
- Big Data technologies (e.g., Hadoop, Spark)
- Data security

**General Skills:**
- Experience with Agile and DevOps methodologies and the software development lifecycle
- Proactive and accountable for deliverables
- Ability to identify and escalate dependencies and risks
- Proficiency in working with DevOps tools with limited supervision
- Timely completion of assigned tasks and regular status reporting
- Capability to train new team members
- Desired knowledge of cloud solutions such as Azure or AWS, with DevOps/cloud certifications
- Ability to work effectively with multicultural global teams, including virtually
- Strong relationship-building skills with project stakeholders

Join EY in its mission to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Leveraging data and technology, diverse EY teams across 150+ countries provide assurance and support clients in growth, transformation, and operations across various sectors. Through its services in assurance, consulting, law, strategy, tax, and transactions, EY teams strive to address complex global challenges by asking insightful questions to discover innovative solutions.
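One recurring pattern behind the data-ingestion responsibilities listed above is incremental (watermark-based) loading: each run picks up only rows modified since the last successful load. The sketch below shows the idea in plain Python with invented row data; in practice this logic would live in an Azure Data Factory pipeline or a Databricks job:

```python
from datetime import datetime

# Watermark-based incremental ingestion sketch. The rows and the watermark
# column name are illustrative, not from any real pipeline.

source_rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "updated_at": datetime(2024, 1, 9)},
]

def incremental_load(rows, watermark):
    """Select only rows modified after the last load, and advance the watermark."""
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows, wm = incremental_load(source_rows, datetime(2024, 1, 2))
assert [r["id"] for r in rows] == [2, 3]
assert wm == datetime(2024, 1, 9)
```

The important property is that the watermark only advances after a successful load, so a failed run can safely be retried without losing or duplicating rows.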
Posted 3 days ago
3.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have hands-on experience with Celonis EMS (Execution Management System) and possess strong SQL skills for data extraction, transformation, and modeling. Proficiency in PQL (Process Query Language) for custom process analytics is essential, along with experience in integrating Celonis with SAP, Oracle, Salesforce, or other ERP/CRM systems. Knowledge of ETL, data pipelines, and APIs (REST/SOAP) is crucial for this role. You should also demonstrate expertise in process mining and analytical skills, including an understanding of business process modeling and process optimization techniques, and at least one OCPM project experience. Your responsibilities will include analyzing event logs to identify bottlenecks, inefficiencies, and automation opportunities. With 6-10 years of experience in the IT industry, focusing on data architecture and business process, and specifically 3-4 years of experience in process mining, data analytics, or business intelligence, you should be well-equipped for this position. A Celonis certification (e.g., Celonis Data Engineer, Business Analyst, or Solution Consultant) would be a plus, and any additional OCPM experience is also welcomed. Candidates who can join within 30-45 days will be given priority consideration for this role.
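The event-log analysis described above, finding bottlenecks from case/activity/timestamp data, can be illustrated outside Celonis. This is a deliberately tiny sketch with an invented event log; inside Celonis the same question would be expressed in PQL:

```python
from collections import defaultdict
from datetime import datetime

# Bottleneck analysis over a tiny, invented event log: which hand-off between
# consecutive activities takes the longest on average?

event_log = [
    ("case_1", "Create Order",  datetime(2024, 1, 1, 9, 0)),
    ("case_1", "Approve Order", datetime(2024, 1, 1, 9, 30)),
    ("case_1", "Ship Order",    datetime(2024, 1, 1, 17, 30)),
    ("case_2", "Create Order",  datetime(2024, 1, 2, 10, 0)),
    ("case_2", "Approve Order", datetime(2024, 1, 2, 10, 20)),
    ("case_2", "Ship Order",    datetime(2024, 1, 2, 16, 20)),
]

def avg_transition_hours(log):
    """Average waiting time in hours between consecutive activities, per transition."""
    by_case = defaultdict(list)
    for case, activity, ts in log:
        by_case[case].append((ts, activity))
    durations = defaultdict(list)
    for events in by_case.values():
        events.sort()
        for (t1, a1), (t2, a2) in zip(events, events[1:]):
            durations[(a1, a2)].append((t2 - t1).total_seconds() / 3600)
    return {k: sum(v) / len(v) for k, v in durations.items()}

stats = avg_transition_hours(event_log)
# Approve -> Ship is the slowest hand-off in this sample, i.e. the bottleneck.
assert max(stats, key=stats.get) == ("Approve Order", "Ship Order")
```

Real process-mining engines add conformance checking, variant analysis, and case filtering on top, but transition timing like this is the core of bottleneck identification.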
Posted 3 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are Allvue Systems, the leading provider of software solutions for the Private Capital and Credit markets. Whether a client wants an end-to-end technology suite or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We’re looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals. Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you’re collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious, and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership, and exceptional results. Come be a part of the team that’s revolutionizing the alternative investment industry. Define your own future with Allvue Systems!

Responsibilities:
- Design, implement, and maintain data pipelines that handle both batch and real-time data ingestion.
- Integrate various data sources (databases, APIs, third-party data) into Snowflake and other data systems.
- Work closely with data scientists and analysts to ensure data availability, quality, and performance.
- Troubleshoot and resolve issues related to data pipeline performance, scalability, and integrity.
- Optimize data processes for speed, scalability, and cost efficiency.
- Ensure data governance and security best practices are implemented.

Requirements:
- 5 to 8 years of total experience, including at least 4 years in data engineering or related roles.
- Strong experience with Snowflake, Kafka, and Debezium.
- Proficiency in SQL, Python, and ETL frameworks.
- Experience with data warehousing, data modeling, and pipeline optimization.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services or fintech industry is highly desirable.
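A core pattern behind the Kafka/Debezium experience asked for above is replaying change-data-capture events into a target store. The sketch below uses a simplified event shape (real Debezium envelopes carry more metadata, such as `before` images and source offsets) and an in-memory dict standing in for Snowflake:

```python
# Applying Debezium-style change events (op = c/u/d) to a target state.
# Event shape is simplified for illustration; a real consumer would read
# these from a Kafka topic and merge them into Snowflake.

def apply_cdc_events(events):
    """Replay insert/update/delete events into an in-memory 'target table'."""
    target = {}
    for event in events:
        key, op = event["key"], event["op"]
        if op in ("c", "u"):          # create / update carry the new row image
            target[key] = event["after"]
        elif op == "d":               # delete removes the row
            target.pop(key, None)
    return target

events = [
    {"key": 1, "op": "c", "after": {"id": 1, "status": "new"}},
    {"key": 2, "op": "c", "after": {"id": 2, "status": "new"}},
    {"key": 1, "op": "u", "after": {"id": 1, "status": "funded"}},
    {"key": 2, "op": "d"},
]

assert apply_cdc_events(events) == {1: {"id": 1, "status": "funded"}}
```

Because each event carries the full row image keyed by primary key, replaying the stream in order is idempotent for upserts, which is what makes at-least-once delivery from Kafka workable.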
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
The role of Sr Specialist Visualization & Automation in Hyderabad, India involves defining and driving the platform engineering of business intelligence solutions with a focus on Power BI technology. As part of your responsibilities, you will use your strong Power BI skills to oversee the creation and management of BI and analytics solutions. You will be instrumental in driving the success of technology usage for solution delivery, best practices, standards definition, compliance, smooth transition to operations, improvements, and enablement of the business. Collaboration with the solution delivery lead and visualization lead on existing, new, and upcoming features, technology decisioning, and the roadmap will be crucial. You will work closely with the solution architect and platform architect to define the visualization architecture pattern based on functional and non-functional requirements, considering available technical patterns. Additionally, you will define and drive the DevOps roadmap to enable Agile ways of working, the CI/CD pipeline, and automation for self-serve governance of the Power BI platform, in collaboration with the platform lead. You will be accountable for ensuring adherence to security and compliance policies and procedures, including Information Security & Compliance (ISC), Legal, ethics, and other compliance policies and procedures, in defining architecture standards, patterns, and platform solutions. The role requires 8-10 years of IT experience in data and analytics and visualization, with strong exposure to Power BI solution delivery and platform automation in a global matrix organization. An in-depth understanding of database management systems, ETL, OLAP, and data lake technologies, and experience in Power BI, is essential. Knowledge of other visualization technologies is a plus. A specialization in the pharma domain and an understanding of data usage across the end-to-end enterprise value chain are advantageous.
Good interpersonal, written, and verbal communication skills, time management, and technical expertise aligned with Novartis Values & Behaviors are necessary. Join Novartis, a company committed to building an outstanding, inclusive work environment with diverse teams representative of the patients and communities served. Be a part of a mission to reimagine medicine and improve lives. If this role does not align with your career goals but you wish to stay connected with Novartis for future opportunities, join the Novartis Network. Explore the benefits, rewards, and the opportunity to create a brighter future together with Novartis.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a member of the infrastructure team at FIS, you will play a crucial role in troubleshooting and resolving technical issues related to Azure and SQL Server. Your responsibilities will include developing data solutions, understanding business requirements, and transforming data from different sources. You will design and implement ETL processes and collaborate with cross-functional teams to ensure that solutions meet business needs. To excel in this role, you should have a degree in Computer Science, a minimum of 4 years of experience, and proficient working knowledge of Azure, SQL, and ETL. Any programming language skills will be an asset, along with working knowledge of Data Warehousing, experience with JSON and XML data structures, and familiarity with working with APIs. At FIS, we offer a flexible and creative work environment where you can learn, grow, and make a real impact on your career. You will be part of a diverse and collaborative atmosphere, with access to professional and personal development resources. Additionally, you will have opportunities to volunteer and support charities, along with competitive salary and benefits. Please note that current and future sponsorship are not available for this position. FIS is committed to protecting the privacy and security of all personal information processed to provide services to our clients. For specific information on how FIS safeguards personal information online, please refer to the Online Privacy Notice. Recruitment at FIS primarily operates on a direct sourcing model, with a small portion of hires through recruitment agencies. FIS does not accept resumes from agencies not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings.
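The JSON handling mentioned above often comes down to flattening nested API payloads into tabular columns before loading them into SQL Server. A minimal sketch of that step, with invented field names:

```python
# Flattening a nested JSON API payload into warehouse-ready columns, a common
# preparation step before a relational load. Field names are illustrative.

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level column dict."""
    flat = {}
    for key, value in record.items():
        column = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{column}_"))
        else:
            flat[column] = value
    return flat

payload = {"id": 7, "customer": {"name": "Acme", "address": {"city": "Pune"}}}
assert flatten(payload) == {
    "id": 7,
    "customer_name": "Acme",
    "customer_address_city": "Pune",
}
```

Lists of child records are the usual complication in practice; they are typically split into separate child tables rather than flattened into columns.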
Posted 3 days ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Senior Technical Lead at Barclays, where you'll have the opportunity to contribute to the evolution of our digital landscape, driving innovation and excellence. In this role, you will leverage cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Your primary responsibility will be to deliver technology stack solutions, utilizing your strong analytical and problem-solving skills to understand business requirements and deliver high-quality solutions. Working collaboratively with fellow engineers, business analysts, and stakeholders, you will tackle complex technical issues that require detailed analytical skills and analysis. To excel as a Senior Technical Lead, you should possess experience in leading a team to perform complex tasks, using your professional knowledge and skills to deliver impactful work that influences the entire business function. You will be responsible for setting objectives, coaching employees to achieve those objectives, conducting performance appraisals, and determining reward outcomes. Whether you have leadership responsibilities or work as an individual contributor, you will lead collaborative assignments, guide team members, and identify new directions for projects to meet desired outcomes. Your role may involve consulting on complex issues, providing advice to leaders, identifying ways to mitigate risks, and developing new policies and procedures to enhance control and governance. You will take ownership of managing risks, strengthening controls, advising on decision-making, contributing to policy development, and ensuring operational effectiveness. Collaboration with other functions and business divisions will be essential to stay aligned with business strategies and activities. 
In addition to the above, some highly valued technical skills for this role include proficiency in Ab Initio and AWS, a strong ETL and data integration background, experience in building complex ETL data pipelines, knowledge of data warehousing principles, Unix, SQL, basic AWS/cloud architecture, data modeling, ETL scheduling, and strong data analysis skills. As a Senior Technical Lead, you will be assessed on key critical skills such as risk management, change and transformation, business acumen, strategic thinking, and digital and technology expertise, along with job-specific technical skills. This role is based in Pune.

**Purpose of the Role:**

The purpose of this role is to design, develop, and enhance software using various engineering methodologies to provide business, platform, and technology capabilities for our customers and colleagues.

**Key Accountabilities:**

- Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring scalability, maintainability, and performance optimization of the code.
- Collaborate with product managers, designers, and engineers to define software requirements, devise solution strategies, and integrate software solutions with business objectives.
- Participate in code reviews, promote a culture of code quality and knowledge sharing, and stay informed about industry technology trends to contribute to technical excellence.
- Adhere to secure coding practices, implement effective unit testing practices, and ensure proper code design, readability, and reliability.

**Assistant Vice President Expectations:**

As an Assistant Vice President, you are expected to advise on decision-making, contribute to policy development, and ensure operational effectiveness. Collaboration with other functions and business divisions is crucial. Whether leading a team or working as an individual contributor, you will be accountable for delivering impactful work, coaching employees, and promoting a culture of excellence.

**Barclays Values and Mindset:**

All colleagues at Barclays are expected to embody the values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behavior.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions. The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. This position requires good collaboration skills to provide guidance on analytics aspects to the team in various analytics-related activities. The ideal candidate is experienced in Qlik Sense architecture design and proficient in load script implementation and best practices, with hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting techniques. They are skilled in data integration through ETL processes from various sources and adept at data transformation, including the creation of QVD files and set analysis, and capable of data modeling using dimensional modeling, star schema, and snowflake schema. The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies. Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or equivalent work experience. Effective verbal and written communication skills are essential.
Additionally, candidates are required to have a minimum of 3-5 years of experience in implementing end-to-end business intelligence using Qlik Sense, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. Understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial. Technical expertise in Qlik Sense, SQL Server, and data modeling, and experience with clinical trial data and SDTM standards, is beneficial. This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
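One of the SQL validations mentioned above — checking the star schema behind a dashboard — is a referential-integrity test between fact and dimension tables. A minimal SQLite sketch with an invented clinical-style schema (a real check would run against SQL Server):

```python
import sqlite3

# Referential-integrity check between a fact table and a dimension table.
# Schema and data are illustrative: visit 102 references a subject that is
# missing from the dimension, so it should surface as an orphan.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_subject (subject_key INTEGER PRIMARY KEY, site TEXT);
    CREATE TABLE fact_visit (visit_id INTEGER, subject_key INTEGER);
    INSERT INTO dim_subject VALUES (1, 'Site A'), (2, 'Site B');
    INSERT INTO fact_visit VALUES (100, 1), (101, 2), (102, 3);
""")

orphans = conn.execute("""
    SELECT f.visit_id
    FROM fact_visit f
    LEFT JOIN dim_subject d ON f.subject_key = d.subject_key
    WHERE d.subject_key IS NULL
""").fetchall()

assert orphans == [(102,)]  # visit 102 points at a missing dimension row
```

Orphaned fact rows like this are a frequent cause of totals in a dashboard not matching the source system, since joins silently drop them.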
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. The Position is a senior technical, hands-on delivery role, requiring knowledge of data engineering, cloud infrastructure, platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate with 6-8 years of experience will possess strong technical skills, an eagerness to learn, a keen interest in Financial Crime, Financial Risk, and Compliance technology transformation, the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation. In this role, you will: - Ingest and provision raw datasets, enriched tables, and curated, re-usable data assets to enable a variety of use cases. - Drive improvements in the reliability and frequency of data ingestion, including increasing real-time coverage. - Support and enhance data ingestion infrastructure and pipelines. - Design and implement data pipelines to collect data from disparate sources across the enterprise and external sources and deliver it to the data platform. - Implement Extract Transform and Load (ETL) workflows, ensuring data availability at each stage in the data flow. - Identify and onboard data sources, conduct exploratory data analysis, and evaluate modern technologies, frameworks, and tools in the data engineering space. 
Core/Must-Have skills:
- 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (ETL: ODI, SSIS; DB: PL/SQL; and AWS Redshift).
- Experience in managing data extraction, transformation, and loading from various sources using Oracle Data Integrator and other tools such as SSIS.
- Database design and dimensional modeling using Oracle PL/SQL and Microsoft SQL Server.
- Advanced working SQL knowledge and experience with relational and NoSQL databases.
- Strong analytical and critical thinking skills, expertise in data modeling and DB design, and experience building and optimizing data pipelines.

Good to have:
- Experience in Financial Crime, Financial Risk, and Compliance technology transformation domains.
- Certification in any cloud tech stack, preferably Microsoft Azure.
- In-depth knowledge and hands-on experience with data engineering, data warehousing, and Delta Lake on on-prem and cloud platforms.
- Ability to script, code, query, and design systems for maintaining Azure/AWS lakehouses, ETL processes, business intelligence, and data ingestion pipelines.

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 days ago
0.0 - 3.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly skilled Technical Data Analyst with expertise in Oracle PL/SQL and Python, well-versed in data analysis tools and techniques. Your role involves leading and mentoring a team of data analysts to derive data-driven insights and contribute to key business decisions. Additionally, you will research and evaluate emerging AI tools for potential application in data analysis projects. Your responsibilities will include designing, developing, and maintaining complex Oracle PL/SQL queries and procedures for ETL processes. You will utilize Python scripting for data analysis, automation, and reporting, as well as perform in-depth data analysis to provide actionable insights for improving business performance. Collaborating with cross-functional teams, you will translate business requirements into technical specifications and maintain data quality standards across systems. Moreover, you will leverage data analysis and visualization tools like Tableau, Power BI, and Qlik Sense to create interactive dashboards and reports for stakeholders. It is essential to stay updated with the latest data analysis tools, techniques, and industry best practices, including advancements in AI/ML. You will also research and evaluate emerging AI/ML tools for potential application in data analysis projects. Preferred qualifications for this role include hands-on experience as a Technical Data Analyst with expertise in Oracle PL/SQL and Python, proficiency in Python scripting, and familiarity with data visualization tools like Tableau, Power BI, or Qlik Sense. Additionally, awareness of AI/ML tools and techniques in data analytics and practical experience applying these techniques in projects will be advantageous. Strong analytical, problem-solving, communication, and interpersonal skills are essential, along with experience in the financial services industry. 
Qualifications for this position include 0-2 years of relevant experience; programming/debugging skills used in business applications; and knowledge of industry standards, specific business areas for application development, and programming languages. Clear and concise written and verbal communication is consistently demonstrated. Education requirements include a Bachelor's degree or equivalent experience. This job description offers a high-level overview of the work involved, and additional job-related duties may be assigned as needed. Mandatory skills required for this role are Ab Initio, Oracle PL/SQL, and Unix/Linux, with a minimum of 2 years of hands-on development experience.
Posted 3 days ago
3.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Specialist, Technical Professional Services (ETL Programming) at Fiserv, you will be responsible for designing large and complex Data Migration, Data Warehousing, and Business Intelligence Solutions under general supervision. Your role will involve analyzing conversion requirements, interpreting clients' existing systems, and taking complete ownership of the technical delivery of assigned conversion/implementation projects. Additionally, you will manage multiple clients, adhere to project timelines, and monitor project progress by tracking activities and resolving issues. You will be expected to assist management in planning and designing improvements to business processes, utilize your problem-solving skills to resolve moderately complex issues, and communicate progress and potential problems to the Project Manager. Your responsibilities will also include maintaining tools for ensuring the efficiency and effectiveness of the conversion process, providing post-implementation support for 2 weeks, and working in US shift timings based on client requirements. To qualify for this role, you should hold a B. Tech/MCA/MSc (CS/IT) degree and have 3 to 10 years of experience in the IT industry. You must possess excellent programming skills in SQL/SSIS, a good understanding of ETL, and knowledge of activities performed in the conversion/implementation of core Banking applications. Additionally, experience supporting Banking Core Conversions and familiarity with Account Processing core systems would be advantageous. Ideal candidates will have exposure to the Banking and Financial Services industry, including a good understanding of Banking Products, Services & Procedures. Proficiency in Mainframe/COBOL/JCL, strong analytical skills, leadership abilities, and proficiency with Excel are also desirable skills for this role. 
At Fiserv, we are committed to diversity and inclusion and provide reasonable accommodations for individuals with disabilities during the job application and interview process. We caution applicants against fraudulent job postings not affiliated with Fiserv and recommend reporting any suspicious activity to local law enforcement authorities. If you are interested in joining Fiserv, please apply using your legal name, complete the step-by-step profile, and attach your resume to be considered for this Specialist position in Technical Professional Services (ETL Programming).,
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
A career within Functional and Industry Technologies services will provide you with the opportunity to build secure and new digital experiences for customers, employees, and suppliers. We focus on improving apps or developing new apps for traditional and mobile devices as well as conducting usability testing to find ways to improve our clients' user experience. As part of our team, you'll help clients harness technology systems in financial services focusing on areas such as insurance, sales performance management, retirement and pension, asset management, and banking & capital markets. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional; our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future. Responsibilities As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: - Use feedback and reflection to develop self-awareness, personal strengths and address development areas. - Delegate to others to provide stretch opportunities, coaching them to deliver results. - Demonstrate critical thinking and the ability to bring order to unstructured problems. - Use a broad range of tools and techniques to extract insights from current industry or sector trends. - Review your work and that of others for quality, accuracy, and relevance. - Know how and when to use tools available for a given situation and can explain the reasons for this choice. 
- Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. - Use straightforward communication, in a structured way, when influencing and connecting with others. - Able to read situations and modify behavior to build quality relationships. - Uphold the firm's code of ethics and business conduct. Years of Experience - 2 to 5 years of experience Education Qualification: BTech/BE/MTech/MS/MCA Preferred Skill Set/Roles and Responsibility: - Hands-on experience in P&C Insurance on the Guidewire DataHub/InfoCenter platform. - Experience in mapping the Guidewire Insurance Suite of products (PC/BC/CC/CM) to DHIC. - Works with the business in identifying detailed analytical and operational reporting/extract requirements. - Able to create complex Microsoft SQL / ETL / SSIS queries. - Participates in Sprint development, test, and integration activities. - Creates detailed source-to-target mappings. - Creates and validates data dictionaries. - Writes and validates data translation and migration scripts. - Communicating with the business to gather business requirements. - Performs GAP analysis between existing (legacy) and new (GW) data-related solutions. - Working with Informatica ETL developers. - Knowledge of AWS cloud.
Posted 3 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This is a data engineer position - a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner in coordination with the Data & Analytics team. The overall objective is defining optimal solutions to data collection, processing, and warehousing. Must have Spark Java development expertise in big data processing, as well as Python and Apache Spark, particularly within the banking & finance domain. He/She designs, codes, and tests data systems and works on implementing those into the internal infrastructure. Responsibilities: Ensuring high-quality software development, with complete documentation and traceability Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance Ensure efficient data storage and retrieval using Big Data technologies Implement best practices for Spark performance tuning, including partitioning, caching, and memory management Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins) Work on batch processing frameworks for market risk analytics Promoting unit/functional testing and code inspection processes Work with business stakeholders and Business Analysts to understand the requirements Work with other data scientists to understand and interpret complex datasets Qualifications: 5-8 years of experience working in data ecosystems. 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other Big Data frameworks. 
3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), plus Scala and SQL Data Integration, Migration & Large-Scale ETL experience (common ETL platforms such as PySpark/DataStage/Ab Initio, etc.) - ETL design & build, handling, reconciliation, and normalization Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge of performance tuning) Experienced in working with large and multiple datasets and data warehouses Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets. Strong analytic skills and experience working with unstructured datasets Ability to effectively use complex analytical, interpretive, and problem-solving techniques Experience with Confluent Kafka, Red Hat jBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira Experience with external cloud platforms such as OpenShift, AWS & GCP Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos) Experienced in integrating search solutions with middleware & distributed messaging - Kafka Highly effective interpersonal and communication skills with tech/non-tech stakeholders. Experienced in software development life cycle and good problem-solving skills. 
Excellent problem-solving skills and strong mathematical and analytical mindset Ability to work in a fast-paced financial environment Education: Bachelor’s/University degree or equivalent experience in computer science, engineering, or similar domain ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Data Architecture ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
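The qualifications in the posting above call out ETL reconciliation alongside design and build. As a hedged sketch (the record layout, field names, and helper function are all invented for illustration, not taken from the posting), a minimal post-load reconciliation check in Python could compare row counts and control totals between a source extract and the loaded target:

```python
# Hypothetical ETL reconciliation: after a load, verify that row counts and
# summed control totals in the target match the source extract.
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Compare row counts and summed amounts; return a small discrepancy report."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "count_match": src_count == tgt_count,
        "total_match": src_total == tgt_total,
        "count_diff": tgt_count - src_count,   # negative: rows missing in target
        "total_diff": tgt_total - src_total,   # negative: value missing in target
    }

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
target = [{"id": 1, "amount": 10.0}]  # simulate one row failing to load
print(reconcile(source, target))
```

At scale this comparison would run as a Spark job against the actual source and target tables rather than in-memory lists, but the count-and-control-total pattern is the same.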
Posted 3 days ago
5.0 years
0 Lacs
Greater Chennai Area
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in problem-solving discussions and contribute to the overall success of the projects by leveraging your expertise in application development. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Strong understanding of data processing frameworks and distributed computing. - Experience with data integration and ETL processes. - Familiarity with cloud platforms and services related to application development. - Ability to write efficient and scalable code. Additional Information: - The candidate should have minimum 5 years of experience in PySpark. - This position is based at our Chennai office. - A 15 years full time education is required.
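PySpark itself needs a Spark runtime, but the transformation pattern this developer role centers on — filter invalid records, derive a column, aggregate by key — can be sketched with plain Python as a stand-in (all field names and values here are invented; in PySpark the same shape would appear as `df.filter(...).withColumn(...).groupBy(...).agg(...)`):

```python
# Stand-in for a PySpark ETL flow using plain Python iterables.
from collections import defaultdict

def run_pipeline(records):
    """Filter out invalid rows, derive a field, then aggregate totals per category."""
    valid = (r for r in records if r.get("amount") is not None)        # filter
    enriched = ({**r, "amount_usd": r["amount"] * r.get("fx", 1.0)}    # map / withColumn
                for r in valid)
    totals = defaultdict(float)                                        # groupBy + sum
    for r in enriched:
        totals[r["category"]] += r["amount_usd"]
    return dict(totals)

rows = [
    {"category": "loans", "amount": 100.0, "fx": 1.0},
    {"category": "cards", "amount": 50.0, "fx": 2.0},
    {"category": "loans", "amount": None},  # dropped by the filter step
]
print(run_pipeline(rows))  # {'loans': 100.0, 'cards': 100.0}
```

The generator expressions mirror Spark's lazy evaluation: nothing is computed until the final aggregation consumes the stream, which is the same reason Spark pipelines only execute when an action is called.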
Posted 3 days ago
5.0 years
0 Lacs
Greater Chennai Area
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : PySpark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that the applications developed meet both user needs and technical specifications. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team skills and capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must To Have Skills: Proficiency in PySpark. - Strong understanding of data processing frameworks and distributed computing. - Experience with data integration and ETL processes. - Familiarity with cloud platforms and services related to data processing. - Ability to mentor junior team members and provide technical guidance. Additional Information: - The candidate should have minimum 7.5 years of experience in PySpark. - This position is based in Pune. - A 15 years full time education is required.
Posted 3 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office. Who We Are Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE. Job Description Aruba is an HPE company and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the “Intelligent Edge” – and creating new customer experiences across intelligent spaces and digital workspaces. Join us to redefine what’s next for you. How You Will Make Your Mark… The ideal candidate will have experience with deploying and managing enterprise-scale Data Governance practices, along with Data Engineering experience developing the database layer to support and enable AI initiatives as well as a streamlined user experience with Data Discovery and Security & Access Control, for meaningful, business-relevant analytics. The candidate will be comfortable with the full-stack analytics ecosystem, including the database layer, BI dashboards, and AI/Data Science models & solutions, to effectively define and implement a scalable Data Governance practice. 
What You’ll Do Responsibilities: Drive the design and development of Data Dictionary, Lineage, Data Quality, and Security & Access Control for business-relevant data subjects & reports across business domains. Engage with the business user community to enable ease of Data Discovery and build trust in the data through Data Quality & Reliability monitoring, with key metrics & SLAs defined. Support the development and maintenance of data subjects in the database layer to enable BI dashboards and AI solutions. Drive the engagement and alignment with the HPE IT/CDO team on Governance initiatives, including partnering with functional teams across the business. Test, validate, and assure the quality of complex AI-powered product features. Partner with a highly motivated and talented set of colleagues. Be a motivated self-starter who can operate with minimal handholding. Collaborate across teams and time zones, demonstrating flexibility and accountability. Education And Experience Required 7+ years of Data Governance and Data Engineering experience, with significant exposure to enabling data availability, data discovery, quality & reliability, with appropriate security & access controls in an enterprise-scale ecosystem. First-level university degree. What You Need To Bring Knowledge and Skills: Experience working with data governance & metadata management tools (Collibra, Databricks Unity Catalog, Atlan, etc.). Subject matter expertise in consent management concepts and tools. Demonstrated knowledge of research methodology and the ability to manage complex data requests. Excellent analytical thinking, technical analysis, and data manipulation skills. Proven track record of developing SQL SSIS packages with ETL flows. Experience with AI application deployment governance is a plus. Technologies such as MS SQL Server, Databricks, Hadoop, SAP S4/HANA. Experience with SQL databases and building SSIS packages; knowledge of NoSQL and event streaming (e.g., Kafka) is a bonus. 
Exceptional interpersonal skills and written communication skills. Experience and comfort solving problems in an ambiguous environment where there is constant change. Ability to think logically, communicate clearly, and be well organized. Strong knowledge of Computer Science fundamentals. Experience working with LLMs and generative AI frameworks (e.g., OpenAI, Hugging Face, etc.). Proficiency in MS Power Platform; Java, Scala, and Python experience preferred. Strong collaboration and communication skills. Performing deep-dive investigations, including applying advanced techniques, to solve some of the most critical and complex business problems in support of business transformation to enable Product, Support, and Software as a Service offerings. Strong business acumen and technical knowledge within area of responsibility. Strong project management skills. Additional Skills Accountability, Active Learning, Active Listening, Bias, Business Decisions, Business Development, Business Metrics, Business Performance, Business Strategies, Calendar Management, Coaching, Computer Literacy, Creativity, Critical Thinking, Cross-Functional Teamwork, Design Thinking, Empathy, Follow-Through, Growth Mindset, Intellectual Curiosity, Leadership, Long Term Planning, Managing Ambiguity, Personal Initiative {+ 5 more} What We Can Offer You Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing. Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. 
We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #aruba Job Business Planning Job Level Specialist HPE is an Equal Employment Opportunity/ Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 3 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary: We are seeking a Senior Python Developer with a strong background in backend development and a passion for designing and implementing efficient algorithms. The ideal candidate will be responsible for developing, maintaining, and optimizing our core backend systems and services, with a particular focus on complex algorithms. This role requires a deep understanding of Python, strong problem-solving skills, and the ability to work collaboratively in a fast-paced environment. You will play a key role in designing, developing, and maintaining robust data pipelines, APIs, and data processing workflows. You will work closely with data analysts and business teams to understand data requirements and deliver insightful data-driven solutions. The ideal candidate is passionate about data, enjoys problem-solving, and thrives in a collaborative environment. Experience in the financial or banking domain is a plus. Responsibilities: Design, develop, and maintain robust and scalable data pipelines using Python, SQL, PySpark, and streaming technologies like Kafka. Perform efficient data extraction, transformation, and loading (ETL) for large volumes of data from diverse data providers, ensuring data quality and integrity. Build and maintain RESTful APIs and microservices to support seamless data access and transformation workflows. Develop reusable components, libraries, and frameworks to automate data processing workflows, optimizing for performance and efficiency. Apply statistical analysis techniques to uncover trends, patterns, and actionable business insights from data. Implement comprehensive data quality checks and perform root cause analysis on data anomalies, ensuring data accuracy and reliability. Collaborate effectively with data analysts, business stakeholders, and other engineering teams to understand data requirements and translate them into technical solutions. 
Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field. 5+ years of proven experience in Python development, with a strong focus on data handling, processing, and analysis. Extensive experience building and maintaining RESTful APIs and working with microservices architectures. Proficiency in building and managing data pipelines using APIs, ETL tools, and Kafka. Solid understanding and practical application of statistical analysis methods for business decision-making. Hands-on experience with PySpark for large-scale distributed data processing. Strong SQL skills for querying, manipulating, and optimizing relational database operations. Deep understanding of data cleaning, preprocessing, and validation techniques. Knowledge of data governance, security, and compliance standards is highly desirable. Experience in the financial services industry is a plus. Familiarity with basic machine learning (ML) concepts and experience preparing data for ML models is a plus. Strong analytical, debugging, problem-solving, and communication skills. Ability to work both independently and collaboratively within a team environment. Preferred Skills: Experience with CI/CD tools and Git-based version control. Experience in the financial or banking domain. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. 
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
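The posting above lists "comprehensive data quality checks" among the responsibilities. As a hedged illustration only (the function, key names, and batch layout are invented, not Citi's actual tooling), two of the most common checks — required-field nulls and duplicate keys — might be scripted like this before a batch is published downstream:

```python
# Hypothetical data-quality gate: detect missing required fields and
# duplicate keys in a batch of records before publishing downstream.
def quality_report(rows, key="id", required=("id", "amount")):
    """Return row indices with missing required fields and any duplicated key values."""
    missing = [i for i, r in enumerate(rows)
               if any(r.get(f) is None for f in required)]
    seen, dupes = set(), []
    for r in rows:
        k = r.get(key)
        if k in seen:
            dupes.append(k)  # record every repeat occurrence of a key
        seen.add(k)
    return {"missing_required": missing, "duplicate_keys": dupes}

batch = [{"id": 1, "amount": 5}, {"id": 1, "amount": 7}, {"id": 2, "amount": None}]
print(quality_report(batch))  # {'missing_required': [2], 'duplicate_keys': [1]}
```

In a real pipeline the report would feed root-cause analysis (which upstream provider produced the bad rows, and when) rather than just being printed.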
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP BW/4HANA Data Modeling & Development Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Assist in the documentation of application specifications and user guides. - Engage in continuous learning to stay updated with the latest technologies and methodologies. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development. - Strong understanding of data warehousing concepts and best practices. - Experience with ETL processes and data integration techniques. - Familiarity with reporting tools and data visualization techniques. - Ability to troubleshoot and optimize existing applications for better performance. Additional Information: - The candidate should have minimum 3 years of experience in SAP BW/4HANA Data Modeling & Development. - This position is based at our Pune office. - A 15 years full time education is required.
Posted 3 days ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At PwC, our teams in Scaled Engineering Services are dedicated to delivering reliable, scalable, and cost-effective technology solutions that enable clients to achieve operational excellence and business agility. These teams apply technical expertise and a strong service-oriented mindset to support the design, development, deployment, and maintenance of enterprise-grade IT systems and applications. Professionals in engineering management roles will design and implement technology solutions to meet clients' business needs. You will leverage your experience in analysing requirements, developing technical designs to enable the successful delivery of solutions. Driven by a passion for engineering excellence and scalable design, you lead with purpose, guiding teams to deliver high-quality solutions that meet complex client needs. You take ownership of defining reliable solutions ensuring technical integrity and enabling successful delivery through hands-on leadership and mentorship. By leveraging your expertise in backend systems, cloud-native technologies, and infrastructure best practices, you coach team members, maximize their strengths, and foster a culture of continuous improvement—all while aligning solutions with business goals and delivery commitments. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Take ownership of technical solutions and technical delivery, ensuring successful planning, estimation, execution, and deployment of complex IT initiatives. Collaborate with project and engineering leads to drive shared accountability for technical quality, delivery timelines, and overall solution outcomes. Mentor and guide developers and engineers, sharing solution insights and promoting best practices across the team. Use code and design reviews as a platform to elevate team capabilities and encourage continuous technical growth. 
Proactively address technical challenges, leading difficult conversations with clarity and professionalism, and escalating issues when necessary. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Software and Product Innovation team you will lead the development of modern applications and drive innovative engineering solutions. As a Manager you will oversee engineering teams, fostering a culture of ownership and continuous improvement while delivering reliable and scalable solutions. This role offers the chance to work with advanced technologies and collaborate with cross-functional teams to deliver impactful projects that align with business goals. 
Responsibilities Lead the development and implementation of modern applications Supervise engineering teams to promote successful project execution Foster a culture of accountability and continuous enhancement Collaborate with cross-functional teams to align projects with business objectives Utilize advanced technologies to drive innovative engineering solutions Assure the delivery of scalable and dependable solutions Mentor team members to encourage professional growth and development Analyze project outcomes to identify areas for improvement What You Must Have Bachelor's Degree in Computer Science, Information Technology, Engineering 12 years of experience Strong hands-on experience in building modern applications using Java, .NET, Node.js, Python, or similar technologies Solid understanding of data engineering concepts such as data pipelines, ETL workflows, and working with large datasets Good grasp of cloud-native solutions and experience with platforms like AWS, Azure, or GCP Strong understanding of DevOps principles including CI/CD, infrastructure as code, containerization, and monitoring Proven leadership in managing engineering teams and delivering reliable, scalable solutions Ability to work in agile environments and manage delivery across sprints and releases Familiarity with solution design, API integration, and secure development practices Ability to engage with technical and solution architects to understand design goals and contribute practical implementation insights Excellent communication, team mentoring, and problem-solving skills Oral and written proficiency in English required What Sets You Apart Bachelor’s Degree in Engineering or Technology with relevant specialization Proven experience in software delivery ownership Overseeing data engineering project implementations Guiding DevOps practices for system reliability Collaborating with architects on design validation Leading engineering teams with technical guidance Managing cross-functional coordination for project delivery Monitoring project health and delivery metrics Driving a culture of accountability and improvement
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 3 Year(s) of experience is required
Educational Qualification: BTech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance user experience and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with team members to design and optimize applications.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on industry trends and technologies to enhance application development processes.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data analytics and data modeling.
- Experience with cloud-based data warehousing solutions.
- Hands-on experience in SQL and database management.
- Knowledge of ETL processes and data integration techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Chennai office.
- A BTech degree is required.
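The ETL and data-integration skills this listing names can be illustrated with a toy extract-transform-load step. This is a minimal plain-Python sketch (no BigQuery client assumed; the field names `user_id` and `amount` are invented for illustration):

```python
# Toy ETL step: extract raw records, transform (cast + filter), load into a
# keyed "table". Field names (user_id, amount) are hypothetical.

def extract(rows):
    """Extract: yield raw dict records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: cast amounts to float and drop rows with a missing user_id."""
    for r in records:
        if not r.get("user_id"):
            continue
        yield {"user_id": r["user_id"], "amount": float(r["amount"])}

def load(records):
    """Load: aggregate amounts per user, mimicking a grouped target table."""
    table = {}
    for r in records:
        table[r["user_id"]] = table.get(r["user_id"], 0.0) + r["amount"]
    return table

raw = [
    {"user_id": "u1", "amount": "10.5"},
    {"user_id": "",   "amount": "3.0"},   # dropped: no user_id
    {"user_id": "u1", "amount": "4.5"},
    {"user_id": "u2", "amount": "7.0"},
]
result = load(transform(extract(raw)))
```

In a real BigQuery pipeline each stage would read from and write to warehouse tables, but the generator-based staging shown here is the same shape an ETL workflow takes.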
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform, PySpark
Good to have skills: NA
Minimum 3 Year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to adapt to evolving technologies and methodologies.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Collaborate with cross-functional teams to ensure alignment on project goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform, PySpark.
- Strong understanding of data processing and analytics.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data integration and ETL processes.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform, MySQL
Good to have skills: PySpark
Minimum 3 Year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in troubleshooting and optimizing application performance, while actively participating in discussions to enhance project outcomes and drive innovation.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark, Databricks Unified Data Analytics Platform, MySQL.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to data analytics.
- Ability to write efficient SQL queries for data manipulation and retrieval.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
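The "efficient SQL queries for data manipulation and retrieval" skill above favors set-based queries over row-by-row application loops. A minimal sketch using the stdlib `sqlite3` module as a stand-in engine (the `orders` schema and data are made up for illustration; a MySQL connection would take the same SQL):

```python
import sqlite3

# In-memory database standing in for a real warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, "acme", 80.0), (3, "globex", 50.0)],
)

# One set-based aggregate query instead of fetching every row and
# looping in application code.
rows = conn.execute(
    """
    SELECT customer, COUNT(*) AS n_orders, SUM(total) AS revenue
    FROM orders
    GROUP BY customer
    ORDER BY revenue DESC
    """
).fetchall()
conn.close()
```

Pushing the `GROUP BY` into the database lets the engine use its indexes and avoids shipping raw rows to the client, which is the core of writing "efficient" retrieval queries.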
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Python (Programming Language), Apache Airflow
Minimum 3 Year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Apache Airflow, Python (Programming Language).
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and management best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- A 15 years full time education is required.
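Apache Airflow, listed above as good-to-have, schedules pipeline tasks as a DAG and runs each task only after its upstream dependencies finish. The ordering idea can be sketched in plain Python with the stdlib `graphlib` module (the task names are invented; no Airflow import is assumed):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it depends on,
# the same upstream/downstream relationship Airflow expresses with >> operators.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

# static_order() yields tasks in a dependency-respecting execution order.
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and backfills on top, but this topological ordering is the contract a DAG gives you: `load` can never run before `transform` and `validate` have completed.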
Posted 3 days ago
5.0 years
0 Lacs
India
Remote
Position: Data Engineer (contract position)
Location: Remote
Duration: 6-month contract with extensions
Pay range: $15 USD per hour (40 hours per week)

Insight Global is seeking a skilled Data Engineer to design, build, and maintain our data processing systems using Azure cloud technologies. The ideal candidate will leverage Databricks, Azure Data Factory (ADF), and other Azure services to transform raw data into valuable business insights while ensuring scalability, reliability, and performance of our data infrastructure.

Key Responsibilities
• Design and implement end-to-end data pipelines using Azure Data Factory and Databricks
• Develop, optimize, and maintain ETL/ELT processes to ingest data from various sources
• Write efficient SQL queries and stored procedures for data extraction and transformation
• Create and maintain Databricks notebooks using Python for data processing and analysis
• Implement data quality checks and monitoring solutions
• Collaborate with data scientists and analysts to understand data requirements
• Participate in code reviews and maintain documentation
• Implement CI/CD pipelines for data solutions using GitHub
• Optimize data models for performance, scalability, and cost-efficiency
• Ensure data security and compliance with organizational policies

Required Skills and Experience
• Bachelor's, Master's, or Ph.D. degree in Computer Science, Mathematics, Statistics, or a related field
• 5+ years of experience in data engineering roles
• Strong proficiency in SQL for complex data manipulation and querying
• Python and PySpark programming expertise with a focus on data processing libraries
• Hands-on experience with Azure Databricks and Delta Lake
• Experience building and orchestrating data pipelines with Azure Data Factory
• Familiarity with Azure services such as Azure SQL, Azure Storage, and Azure Synapse Analytics
• Experience with version control using GitHub and CI/CD practices
• Knowledge of data modeling techniques and best practices
• Understanding of data warehousing concepts and dimensional modeling
• Experience with large-scale data processing

Nice to Have Skills and Experience
• Knowledge of Azure DevOps for pipeline automation
• Familiarity with real-time data processing technologies
• Experience with NoSQL databases
• Azure certifications (Data Engineer, Solutions Architect)
• Experience with data visualization tools (Power BI, Tableau)
• Understanding of data governance and compliance requirements
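The "data quality checks and monitoring" responsibility in this listing can be sketched as a simple null-rate monitor in plain Python (the 20% threshold, the column names, and the sample rows are all made up; in Databricks the same check would typically run over a DataFrame):

```python
def null_rates(rows, columns):
    """Fraction of missing (None or empty-string) values per column."""
    n = len(rows)
    return {
        c: sum(1 for r in rows if r.get(c) in (None, "")) / n
        for c in columns
    }

def failing_columns(rows, columns, max_null_rate=0.2):
    """Columns whose null rate exceeds the allowed threshold."""
    rates = null_rates(rows, columns)
    return sorted(c for c, rate in rates.items() if rate > max_null_rate)

# Hypothetical sample batch: 2 of 4 email values are missing (rate 0.5).
sample = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": ""},
    {"id": 4, "email": "d@x.com"},
]
bad = failing_columns(sample, ["id", "email"])
```

A monitoring job would run checks like this after each pipeline load and alert (or fail the run) when `failing_columns` is non-empty.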
Posted 3 days ago