
167 Impala Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will participate in establishing and implementing new or revised application systems and programs in coordination with the Technology team. Your main objective is to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised application systems and programs to meet specific business needs or user areas. You will also monitor and control all phases of the development process, provide user and operational support on applications to business users, and recommend and develop security measures in post-implementation analysis. You will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business and system processes, recommend advanced programming solutions, and ensure that essential procedures are followed. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and other team members.

To qualify for this role, you should have 8-12 years of relevant experience in systems analysis and programming of software applications, experience managing and implementing successful projects, and working knowledge of consulting/project management techniques and methods. You should also be able to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements. A Bachelor's degree or equivalent experience is required.

In addition to the general job description, the ideal candidate has 8 to 12 years of application development experience through the full lifecycle, with expertise in UI architecture patterns such as Micro Frontend and NX. Proficiency in Core Java/J2EE applications, data structures, algorithms, Hadoop, the MapReduce framework, Spark, YARN, and other relevant technologies is essential. Experience with the Big Data Spark ecosystem, ETL, BI tools, agile environments, test-driven development, and optimizing software solutions for performance and stability is also preferred. Other job-related duties may be assigned as required.

Posted 18 hours ago

Apply

5.0 - 12.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You should have 5-12 years of experience in Big Data and related technologies, including a deep understanding of distributed computing principles and strong knowledge of Apache Spark. Proficiency in Python programming is required, along with experience building stream-processing systems using technologies such as Hadoop v2, MapReduce, HDFS, Sqoop, Apache Storm, and Spark Streaming.

You should have a good understanding of Big Data querying tools such as Hive and Impala, as well as experience integrating data from sources such as RDBMS, ERP systems, and files. Knowledge of SQL queries, joins, stored procedures, and relational schemas is essential, as is experience with NoSQL databases (HBase, Cassandra, MongoDB) and ETL techniques and frameworks.

The role requires performance tuning of Spark jobs, experience with Azure Databricks, and the ability to lead a team efficiently. Designing and implementing Big Data solutions and following Agile methodology are key aspects of this position.
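The querying skills the listing asks for (joins and aggregation across data pulled from several sources) can be sketched with a tiny example. This is purely illustrative: it uses Python's built-in SQLite as a stand-in for a Hive/Impala engine, and the table and column names are invented.

```python
import sqlite3

# In-memory SQLite database standing in for a Hive/Impala warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical tables: customers from an ERP extract, orders from an RDBMS extract.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "south"), (2, "north")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# The same join + aggregation shape a Hive/Impala query would express.
cur.execute("""
    SELECT c.region, COUNT(o.id) AS n_orders, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY c.region
""")
rows = cur.fetchall()
print(rows)  # [('north', 1, 75.0), ('south', 2, 350.0)]
```

On a real cluster the same SQL runs distributed across nodes; only the engine changes, not the query shape.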

Posted 21 hours ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be working as an Informatica BDM professional at PibyThree Consulting Pvt Ltd. in Pune, Maharashtra. PibyThree is a global cloud consulting and services provider focused on Cloud Transformation, Cloud FinOps, IT Automation, Application Modernization, and Data & Analytics. The company's goal is to help businesses succeed by leveraging technology for automation and increased productivity.

Your responsibilities will include:
- A minimum of 4+ years of development and design experience in Informatica Big Data Management
- Excellent SQL skills
- Hands-on work with HDFS, HiveQL, Informatica BDM, Spark, HBase, Impala, and other big data technologies
- Designing and developing BDM mappings in Hive mode for large INSERT/UPDATE volumes
- Creating complex ETL mappings using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, Lookups, Filters, Sequence, Router, and Update Strategy
- Debugging Informatica and using tools such as Sqoop and Kafka

This is a full-time, in-person position with day shifts. The preferred education qualification is a Bachelor's degree, and the preferred experience is 4 years of total work experience, including 2 years specifically in Informatica BDM.

Posted 2 days ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have 5-12 years of experience in Big Data and related technologies, with expertise in distributed computing principles, an expert-level understanding of Apache Spark, and hands-on Python programming. Proficiency in Hadoop v2, MapReduce, HDFS, and Sqoop is required. Experience building stream-processing systems with Apache Storm or Spark Streaming, and working with messaging systems such as Kafka or RabbitMQ, will be beneficial.

A good understanding of Big Data querying tools such as Hive and Impala, along with experience integrating data from multiple sources including RDBMS, ERP systems, and files, is necessary. You should know SQL queries, joins, stored procedures, and relational schemas, and have experience with NoSQL databases (HBase, Cassandra, MongoDB) and ETL techniques and frameworks. Performance tuning of Spark jobs and familiarity with native cloud data services such as AWS or Azure Databricks is essential.

The role requires the ability to lead a team efficiently, design and implement Big Data solutions, and practice Agile methodology. This position falls under the Data Engineer category and suits candidates with ML/AI engineering, data science, or software engineering backgrounds.
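The stream-processing systems mentioned (Storm, Spark Streaming) typically aggregate events over fixed time windows. A minimal pure-Python sketch of a tumbling-window count follows; the event data and key names are hypothetical, and the real systems distribute this work across a cluster, which the sketch does not attempt.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per key in fixed, non-overlapping time windows.

    `events` is an iterable of (timestamp_seconds, key) pairs, as a
    Kafka consumer or a Spark Streaming micro-batch might deliver them.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (epoch seconds, user id).
events = [(100, "u1"), (103, "u2"), (104, "u1"), (161, "u1")]
print(tumbling_window_counts(events, 60))
# {(60, 'u1'): 2, (60, 'u2'): 1, (120, 'u1'): 1}
```

Sliding windows and watermark handling for late events are the usual next steps in the real engines.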

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Are you intellectually curious and passionate about promoting solutions across organizational boundaries? Join the Consumer & Community Banking (CCB) Stress Testing Transformation team for a dynamic opportunity to design and build creative solutions for the future of stress testing and annual CCAR exercises.

As a Senior Associate on the Stress Testing Transformation Solution team, you will be a strategic thinker who is passionate about designing and building creative solutions for the future of Stress Testing. You will spend your time solving complex problems, demonstrating strategic thought leadership, and designing the way our stakeholders operate. By leveraging a deep understanding of CCB Stress Testing processes and extensive Finance domain knowledge, you will build scalable solutions that optimize process efficiencies, use data assets effectively, and advance platform capabilities.

Responsibilities:
- Collaborate with cross-functional teams to lead the design and implementation of end-to-end solutions for Stress Testing, addressing business problems with various technical solutions.
- Provide expertise in process re-engineering and guidance based on the roadmap for large-scale Stress Testing transformation initiatives.
- Assess, challenge, and provide solutions for Stress Testing processes, focusing on data sources, with the ability to influence and drive the roadmap.
- Evaluate, recommend, and develop solutions and architecture, including integration with APIs, Python, AI/ML technology, and other enterprise applications.
- Leverage data and best-in-class tools to improve processes and controls, enable cross-business applications, and embrace a consistent framework.
- Simplify complex issues into manageable steps and achievements.
- Eliminate manual reporting, reengineer processes, and increase the ability to generate insights faster through an integrated data and platform approach.

Required Qualifications:
- Bachelor's degree in engineering or a related field.
- Experience with business intelligence, analytics, and data wrangling tools such as Alteryx, SAS, or Python.
- Experience with relational databases, optimizing SQL to extract and summarize large datasets, report creation, and ad-hoc analyses.
- Experience with Hive, Spark SQL, Impala, or other big-data query tools.
- Ability to understand the underlying business context beyond raw data and identify business opportunities hidden in data.
- Collaborative skills to work with global teams in a fast-paced, results-driven environment.
- Strong problem-solving and analytical skills with a transformation mindset.

Preferred Qualifications:
- Experience with Databricks, SQL, Python, or other data platforms.
- 8+ years of experience in Analytics Solution and Data Analytics, preferably in the financial services domain.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Rajasthan

On-site

You will be working as a mid-level Snowflake Database Administrator, providing database and application administration and support for the Information Management Analytical Service. The role involves managing data integration, data warehousing, and business intelligence, including enterprise reporting, predictive analytics, data mining, and self-service solutions, and collaborating with different teams on database and application administration, job scheduling/execution, and code deployment support.

Your key responsibilities include providing database support for Big Data tools; performing maintenance tasks, performance tuning, monitoring, developer support, and administrative support for the application toolset; participating in a 24/7 on-call rotation for enterprise job scheduler activities; following ITIL processes; creating and updating technical documentation; installing, upgrading, and configuring the application toolset; and maintaining regular attendance.

To qualify for this role, you need a Bachelor's degree or equivalent experience and 5 years of work experience in IT, including experience in cloud database administration, installing and configuring commercial applications at the OS level, and effective collaboration in a team environment. Preferred skills include scripting in Linux and Windows, experience with Terraform, and knowledge of the insurance and/or reinsurance industry.

In terms of technical requirements, you should be proficient in databases such as Snowflake, Vertica, Impala, PostgreSQL, Oracle, and SQL Server; operating systems including Unix, Linux, CentOS, and Windows; and reporting tools including SAP BusinessObjects, Tableau, and Power BI. This position falls under SOW#23 - Snowflake DBA and requires a minimum of 4 and a maximum of 5 years of experience. Thank you for considering this opportunity.

Posted 3 days ago

Apply

2.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking Data Architects (Senior and Principal) to join our team. The role combines hands-on contribution, customer engagement, and technical team management.

As a Data Architect, you will design, architect, deploy, and maintain solutions on the MS Azure platform using various Cloud & Big Data technologies. You will manage the full life cycle of Data Lake / Big Data solutions, from requirement gathering and analysis through platform selection, architecture design, and deployment. You will implement scalable solutions on the cloud and collaborate with business domain experts, data scientists, and application developers to develop Big Data solutions. You will also explore and learn new technologies for creative problem solving and mentor a team of Data Engineers.

The ideal candidate has strong hands-on experience implementing Data Lakes with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview, along with big data technologies such as Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elasticsearch, Impala, and Sqoop. Proficiency in programming and debugging in Python and Scala/Java is essential, and experience building REST services is a plus. Candidates should also have experience supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of CI/CD with Git and Jenkins / Azure DevOps. Experience setting up cloud-computing infrastructure, hands-on exposure to NoSQL databases, and data modelling in Hive are all highly valued.

Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Hadoop Developer, you will develop a semantic model in the data lake to centralize transformation workflows currently managed in Qlik. Your expertise in data modeling, ETL pipeline development, and performance optimization will enable seamless data consumption for analytics and reporting.

Your key responsibilities will include translating Qlik-specific transformation logic into Hadoop/Impala-based processing, developing modular and reusable transformation layers to enhance scalability and flexibility, optimizing the semantic layer for high performance, and ensuring seamless integration with dashboarding tools. You will design and implement ETL pipelines using Python to streamline data ingestion, transformation, and storage; collaborate with data analysts, BI teams, and business stakeholders to align the semantic model with reporting requirements; and monitor, troubleshoot, and enhance data processing workflows to ensure reliability and efficiency.

The ideal candidate has strong experience with Hadoop, Impala, and distributed data processing frameworks; proficiency in Python for ETL pipeline development and automation; a good command of SQL and performance tuning for large-scale datasets; knowledge of data modeling principles and best practices for semantic layers; familiarity with Qlik transformation logic and the ability to translate it into scalable processing; familiarity with big data performance tuning and optimization strategies; and strong problem-solving skills with the ability to work in a fast-paced environment.

If you have the required skills and qualifications and are interested in this opportunity, please forward your updated resume to vidhya@thinkparms.in.
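The Python ETL pipelines this role describes can be sketched as a small extract-transform-load function. This is an illustrative outline only: the feed format, field names, and SQLite target are invented stand-ins, not the posting's actual workflow.

```python
import csv
import io
import sqlite3

def run_etl(csv_text, conn):
    """Extract rows from a CSV feed, transform them, load into a table."""
    # Extract: parse the raw CSV feed.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: normalize the category, coerce the amount, drop bad records.
    cleaned = [
        {"category": r["category"].strip().lower(),
         "amount": float(r["amount"])}
        for r in rows
        if r["amount"]  # drop records with missing amounts
    ]
    # Load: write the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (category TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:category, :amount)", cleaned)
    return len(cleaned)

conn = sqlite3.connect(":memory:")
feed = "category,amount\n Books ,10.5\nGames,\ntoys,4.0\n"
loaded = run_etl(feed, conn)
print(loaded)  # 2
print(conn.execute("SELECT category, amount FROM sales ORDER BY category").fetchall())
# [('books', 10.5), ('toys', 4.0)]
```

A production pipeline would add logging, idempotent loads, and a scheduler, and would target Impala/HDFS rather than SQLite; the extract-transform-load shape stays the same.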

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Principal Analyst on Citi's Analytics and Information Management (AIM) team in Bangalore, India, you will play a crucial role in creating client-centric analytical solutions for various business challenges. With a focus on client obsession and stakeholder management, you will own and deliver complex analytical projects. Your expertise in business context understanding, data analysis, and project management will be essential in identifying trends and patterns and presenting high-quality solutions to senior management.

Your primary responsibilities will include developing business-critical dashboards, assessing and optimizing marketing programs, sizing the impact of strategic changes, and streamlining existing processes. By leveraging your skills in SQL, Python, PySpark, Hive, and Impala, you will work with large datasets to extract insights that drive revenue growth and business decisions. Additionally, your experience in Investment Analytics, Retail Analytics, Credit Cards, and Financial Services will be valuable in delivering actionable intelligence to business leaders.

To excel in this role, you should hold a master's or bachelor's degree in Engineering, Technology, or Computer Science from a premier institute, along with 5-6 years of experience delivering analytical solutions. Your ability to articulate and solve complex business problems, along with excellent communication and interpersonal skills, will be key in collaborating with cross-functional teams and stakeholders. Moreover, your hands-on experience with Tableau and your project management skills will enable you to mentor and guide junior team members effectively.

If you are passionate about data, eager to tackle new challenges, and thrive in a dynamic work environment, this position offers you the opportunity to contribute to Citi's mission of enabling growth and economic progress through innovative analytics solutions. Join us in driving business success and making a positive impact on the financial services industry. Citi is an equal opportunity and affirmative action employer, offering full-time employment in the field of Investment Analytics, Retail Analytics, Credit Cards, and Financial Services. If you are ready to take your analytics career to the next level, we invite you to apply and be part of our global community at Citi.

Posted 3 days ago

Apply

10.0 - 15.0 years

12 - 20 Lacs

Pune

Hybrid

Database Developer

Company: Kiya.ai
Work Location: Pune
Work Mode: Hybrid

JD:
- Strong knowledge of and hands-on development experience in Oracle PL/SQL
- Strong knowledge of and hands-on experience with SQL analytic functions
- Experience developing complex, numerically intense business logic
- Good knowledge of and experience in database performance tuning
- Fluency in UNIX scripting

Good to have:
- Knowledge of or experience in any of Python, Hadoop/Hive/Impala, horizontally scalable databases, or columnar databases
- Oracle certifications
- Any of the DevOps tools/techniques: CI/CD, Jenkins/GitLab, source control/git, deployment automation such as Liquibase
- Experience with production issues/deployments

Interested candidates, drop your resume to saarumathi.r@kiya.ai
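The SQL analytic functions this listing emphasizes are window functions: running totals, rankings, and the like. A hedged sketch follows, using Python's bundled SQLite (3.25+) in place of Oracle and invented table data; the `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape is the same in both engines.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (account TEXT, day INTEGER, amount REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("A", 1, 100.0), ("A", 2, 50.0), ("B", 1, 30.0)])

# Running total per account: an analytic function partitions the rows
# and accumulates within each partition without collapsing them.
rows = conn.execute("""
    SELECT account, day,
           SUM(amount) OVER (PARTITION BY account ORDER BY day) AS running_total
    FROM trades
    ORDER BY account, day
""").fetchall()
print(rows)  # [('A', 1, 100.0), ('A', 2, 150.0), ('B', 1, 30.0)]
```

Unlike a GROUP BY, every input row survives; each simply gains the aggregate computed over its window, which is what makes analytic functions useful for the numerically intense logic the listing describes.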

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. The financial services practice at EY offers integrated advisory services to financial institutions and other capital markets participants. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

We're looking for Senior and Manager Big Data experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem:
- Expertise in data engineering, including design and development of big data platforms.
- Deep understanding of modern data processing technology stacks such as Spark, HBase, and other Hadoop ecosystem technologies; development in Scala is a plus.
- Deep understanding of streaming data architectures and technologies for real-time and low-latency data processing.
- Experience with agile development methods, including core values, guiding principles, and key agile practices, plus an understanding of the theory and application of Continuous Integration/Delivery.
- Experience with NoSQL technologies and a passion for software craftsmanship.
- Experience in the financial industry is a plus.

Nice-to-have skills include familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals; experience with NoSQL data stores such as HBase, Cassandra, and MongoDB alongside HDFS, Hive, and Impala; schedulers such as Airflow and NiFi; and experience with Hadoop clustering and auto-scaling. The role also involves developing standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis, and defining and developing client-specific best practices around data management within a Hadoop environment on Azure cloud.

To qualify for the role, you must have a BE/BTech/MCA/MBA degree, a minimum of 3 years of hands-on experience in one or more relevant areas, and 6-10 years of total industry experience. Ideally, you'll also have experience in the Banking and Capital Markets domains. Skills and attributes for success include using an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates; strong communication, presentation, and team-building skills; experience producing high-quality reports, papers, and presentations; and experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

You'll join a team with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment; be part of a market-leading, multi-disciplinary team of 1400+ professionals in the only integrated global transaction business worldwide; and have opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.

Working at EY offers inspiring and meaningful projects; education and coaching alongside practical experience for personal development; support, coaching, and feedback from engaging colleagues; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you. EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

3.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

About the Role:
Job Title: Technical Specialist, Big Data (PySpark) Developer
Location: Pune, India

Role Description:
This role is for an Engineer responsible for the design, development, and unit testing of software applications, ensuring that good-quality, maintainable, scalable, and high-performing software is delivered to users in an Agile development environment. The candidate should come from a strong technological background, with good working experience in Python and Spark technology; should be hands-on and able to work independently with minimal technical/tool guidance; and should be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team, and you will make extensive use of Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code review.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum, sprint planning, and retrospectives.
- Apply continuous integration best practices (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables, and take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.

Your skills and experience:
- Engineer with good development experience on a Big Data platform for at least 5 years.
- Hands-on experience in Spark (Hive, Impala).
- Hands-on experience in the Python programming language.
- Preferably, experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Strong analytical skills; proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and matrixed organizations; excellent team player.
- Open-minded and willing to learn the business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge and expertise to team members.

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Java Developer, you will be responsible for analyzing, designing, programming, debugging, and modifying software enhancements and/or new products used in various computer programs. Your expertise in Java, Spring MVC, Spring Boot, database design, and query handling will be utilized to write code, complete programming, and perform testing and debugging of applications. You will work on local, networked, cloud-based, or Internet-related computer programs, ensuring the code meets the necessary standards for commercial or end-user applications such as materials management, financial management, HRIS, mobile apps, or desktop products.

Your role will involve working with RESTful web services/microservices for JSON creation, data parsing and processing in batch and stream mode, and messaging platforms such as Kafka, Pub/Sub, and ActiveMQ. Proficiency with operating systems, Linux, virtual machines, and open-source tools and platforms is crucial for successful implementation. Additionally, you are expected to understand data modeling and storage with NoSQL or relational databases, and to have experience with Jenkins, containerized microservices deployment in cloud environments, and Big Data development (Spark, Hive, Impala, time-series databases).

To excel in this role, you should have a solid understanding of building microservices/web services using Java frameworks, REST API standards and practices, and object-oriented analysis and design patterns. Experience with cloud technologies such as Azure, AWS, and GCP will be advantageous. Candidates with Telecom domain experience and familiarity with protocols such as TCP, UDP, SNMP, SSH, FTP, SFTP, CORBA, and SOAP will be preferred. Enthusiasm for the work, a passion for coding, and a proactive, self-starting attitude are key qualities for success, along with strong communication, analytical, and problem-solving skills and the ability to write quality, testable, modular code.

Experience with Big Data platforms, participation in Agile development methodologies, and experience working in a start-up environment will be beneficial. Team-leading experience is an added advantage, and immediate joiners will be given special priority. If you possess the necessary skills and experience, have a keen interest in software development, and are ready to contribute to a dynamic team environment, we encourage you to apply for this role.
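The JSON creation and parsing the microservices work calls for can be sketched minimally. This is illustrative only: the field names are invented, and Python's standard library stands in for whatever Java stack the role actually uses.

```python
import json

# Create a JSON payload, as a REST endpoint might return it.
payload = json.dumps({"device": "router-7", "metrics": {"cpu": 0.42, "up": True}})

# Parse and process it on the consuming side.
doc = json.loads(payload)
cpu_pct = round(doc["metrics"]["cpu"] * 100)
print(cpu_pct)  # 42
```

In the Java frameworks the listing names, a library such as Jackson plays the role `json` plays here, mapping the same payload to and from typed objects.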

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We are currently seeking a Software Development Engineer-II for the Location Program within the Data & Services group. As a Sr. Software Development Engineer (Big Data Engineer), you will own end-to-end delivery of engineering projects for analytics and BI solutions that combine Mastercard datasets with proprietary analytics techniques, helping businesses worldwide solve multi-million-dollar business problems.

Your responsibilities will include working as a member of a support team to resolve product-related issues, demonstrating good troubleshooting skills and knowledge in support work. You should independently apply problem-solving skills to identify the symptoms and root causes of issues, making effective decisions even when data is ambiguous. Providing technical guidance, support, and mentoring to junior team members will be crucial, along with actively contributing to improvement decisions and making technology recommendations that balance business needs and technical requirements. You must proactively understand stakeholder needs, goals, expectations, and viewpoints to deliver results effectively, and ensure that design thinking accounts for the long-term maintainability of code. You are expected to thrive in a highly collaborative company environment where agility is paramount, and to stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc. System maintenance, production incident problem management, root cause identification, and issue remediation will also fall under your responsibilities.

To excel in this role, you should have a Bachelor's degree in Information Technology, Computer Science, or Engineering, or equivalent work experience, and a proven track record of successfully delivering complex technical assignments. You should possess a solid foundation in Computer Science fundamentals, web applications, and microservices-based software architecture, along with full-stack development experience, including databases such as Oracle, Netezza, and SQL Server, and hands-on experience with technologies such as Hadoop, Python, and Impala. Excellent SQL skills are essential, with experience working with large and complex data sources and the ability to comprehend and write complex queries. Experience working in Agile teams and familiarity with Agile/SAFe tenets and ceremonies is necessary, as are strong analytical and problem-solving abilities and quick adaptation to new technologies, methodologies, and systems. Excellent written and verbal English communication skills are required to interact effectively with multiple technical teams and stakeholders.

To succeed in this role, you should be high-energy, detail-oriented, and proactive, with the ability to function under pressure in an independent environment and a high degree of initiative and self-motivation to drive results effectively.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

Citi Analytics & Information Management (AIM) is a global community that objectively connects and analyzes information to create actionable intelligence for business leaders. As a C12 (Individual Contributor) AVP within the Retail Bank Fraud Analytics team in Citi AIM, your primary focus will be on analyzing transaction data to understand fraud patterns and develop strategies to mitigate fraud losses while minimizing customer impact. You will monitor strategy performance, collaborate with the implementation team, and proactively suggest fraud loss mitigation measures using new data sources and advanced analytics techniques. You will be expected to perform hands-on analysis on a regular and ad hoc basis, extract data from various sources beyond transactions, generate fraud risk insights, recommend business solutions, and optimize existing rules for improved performance. Your role will require a holistic understanding of retail banking products and best practices, and the integration of analytical thinking with business knowledge to develop client-centric solutions. Ideally, you should have experience in analytics within the BFSI domain, and proficiency in basic statistics, hypothesis testing, segmentation, and predictive modeling. Proficiency in decision tree models (CHAID/CART), Logistic Regression, exploratory data analysis, SAS, SQL, Hive, Impala, and Excel is essential. Knowledge of Python, prior experience in Fraud Analytics, and familiarity with Tableau or other data visualization tools are desirable. You should also have experience in stakeholder management across functions and regions, translating data into consumer insights, and effectively communicating findings to business partners and senior leaders. Your role will involve delivering clear presentations, managing projects, supporting regulatory/audit activities, and collaborating with stakeholders to drive targeting and segmentation strategies.
You should possess excellent communication skills, project and process management capabilities, and the ability to work both independently and within a team environment.

Qualifications:
- 8+ years of analytics experience, with prior experience in Fraud analytics preferred
- Advanced analytical and business strategy skills
- Effective communication skills, including the ability to present to business partners and leaders
- Project and process management skills
- Excellent written and verbal communication skills
- Experience in financial services analytics
- Strong organizational skills and the ability to manage multiple projects simultaneously

Education:
- Bachelor's/University degree or equivalent experience; Master's degree preferred

If you require a reasonable accommodation due to a disability for using our search tools or applying for a career opportunity, please review the Accessibility at Citi information. For more details, view Citi's EEO Policy Statement and the Know Your Rights poster.
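The core trade-off in the fraud-strategy work described above (mitigating losses while minimizing customer impact) can be sketched in a few lines of plain Python. This is a toy illustration with invented data and thresholds, not Citi's actual approach:

```python
# Toy rule-based fraud strategy: flag transactions above a threshold and
# measure fraud coverage against customer impact using known labels.
transactions = [
    # (amount, is_fraud) — invented data for the example
    (2500.0, True), (90.0, False), (3100.0, True),
    (45.0, False), (1800.0, False), (2900.0, True),
]

def evaluate_rule(threshold):
    flagged = [(amt, fraud) for amt, fraud in transactions if amt > threshold]
    hits = sum(1 for _, fraud in flagged if fraud)
    total_fraud = sum(1 for _, fraud in transactions if fraud)
    hit_rate = hits / total_fraud    # share of fraud caught
    false_pos = len(flagged) - hits  # legitimate customers impacted
    return hit_rate, false_pos

# Moving the threshold trades fraud coverage against customer impact:
print(evaluate_rule(1000.0))  # catches all 3 frauds but flags 1 good transaction
print(evaluate_rule(2800.0))  # flags no good customers but misses 1 fraud
```

Real strategies replace the single threshold with segment-level rules or model scores (decision trees, logistic regression), but the performance-monitoring loop is the same: measure hit rate against false positives and tune.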

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Coimbatore

Work from Office

Position Name: Data Engineer
Location: Coimbatore (Hybrid, 3 days per week)
Work Shift Timing: 1.30 pm to 10.30 pm (IST)
Mandatory Skills: Scala, Spark, Python, Databricks
Good to have: Java & Hadoop

The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.

Requirements:
- Experience with Big Data technologies (Hadoop, Spark, NiFi, Impala).
- Hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.
- Solid understanding of batch and streaming data processing techniques.
- Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion.
- Expert-level ability to write complex, optimized SQL queries across extensive data volumes.
- Experience with HDFS, NiFi, and Kafka.
- Experience with Apache Ozone, Delta Tables, Databricks, Axon (Kafka), Spring Batch, and Oracle DB.
- Familiarity with Agile methodologies.
- Obsession for service observability, instrumentation, monitoring, and alerting.
- Knowledge of or experience in architectural best practices for building data lakes.

Interested candidates, please share your resume at Neesha1@damcogroup.com along with the details below:
Total Exp:
Relevant Exp in Scala & Spark:
Current CTC:
Expected CTC:
Notice period:
Current Location:
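The "data validation methods" this role calls for can be as simple as declarative column checks applied before data lands downstream. Below is a minimal, generic Python sketch (rule names and schema are invented for illustration; in a real pipeline the same pattern would run inside Spark or Databricks):

```python
# Declarative validation sketch: each rule is a (name, predicate) pair
# applied row by row; failures are collected rather than raised, so a
# pipeline can quarantine bad records instead of aborting.
rules = [
    ("amount_non_negative", lambda row: row["amount"] >= 0),
    ("currency_known",      lambda row: row["currency"] in {"USD", "EUR", "INR"}),
    ("id_present",          lambda row: bool(row.get("txn_id"))),
]

def validate(rows):
    good, bad = [], []
    for row in rows:
        failures = [name for name, pred in rules if not pred(row)]
        (bad if failures else good).append((row, failures))
    return good, bad

sample = [
    {"txn_id": "t1", "amount": 120.0, "currency": "USD"},
{"txn_id": "",   "amount": -5.0,  "currency": "XXX"},
]
good, bad = validate(sample)
print(len(good), len(bad))  # one clean record, one quarantined record
print(bad[0][1])            # names of the rules the bad record failed
```

Keeping rules as data rather than inline conditionals is what makes the checks easy to extend, log, and report on per run.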

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

As an Associate Managing Consultant in Strategy & Transformation at Mastercard's Performance Analytics division, you will be a part of the Advisors & Consulting Services group specializing in translating data into actionable insights. Your role will involve leveraging both Mastercard and customer data to design, implement, and scale analytical solutions for clients. By utilizing qualitative and quantitative analytical techniques and enterprise applications, you will synthesize analyses into clear recommendations and impactful narratives. In this position, you will manage deliverable development and workstreams on projects spanning various industries and problem statements. You will contribute to developing analytics strategies for large clients, leveraging data and technology solutions to unlock client value. Building and maintaining trusted relationships with client managers will be crucial, as you act as a reliable partner in creating predictive models and reviewing analytics end-products for accuracy, quality, and timeliness. Collaboration and teamwork play a significant role in this role, where you will be tasked with developing sound business recommendations, delivering effective client presentations, and leading team and external meetings. Your responsibilities will also include contributing to the firm's intellectual capital, mentoring junior consultants, and fostering effective working relationships with local and global teams. To be successful in this role, you should possess an undergraduate degree with experience in data and analytics, business intelligence, and descriptive, predictive, or prescriptive analytics. You should be adept at analyzing large datasets, synthesizing key findings, and providing recommendations through descriptive analytics and business intelligence. Proficiency in data analytics software such as Python, R, SQL, and SAS, as well as advanced skills in Word, Excel, and PowerPoint, are essential. 
Effective communication in English and the local office language, eligibility to work in the country of application, and a proactive attitude towards learning and growth are also required. Preferred qualifications for this role include additional experience working with the Hadoop framework, data visualization tools like Tableau and Power BI, and coaching junior delivery consultants. While an MBA or master's degree with a relevant specialization is not mandatory, relevant industry expertise would be advantageous. At Mastercard, we prioritize information security, and every individual associated with the organization is expected to abide by security policies, maintain the confidentiality and integrity of accessed information, report any security violations or breaches, and complete required security trainings to ensure the protection of Mastercard's assets, information, and networks.

Posted 1 week ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Mysuru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Experience developing Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (similar to a rules engine).
- Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark.
- Experience applying business transformations with Apache Spark DataFrames/RDDs and using HiveContext objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
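The "custom framework for generating rules (similar to a rules engine)" mentioned above can be sketched in a few lines of plain Python. This is a generic illustration with invented field and rule names, not the framework the posting refers to:

```python
# Toy rules-engine sketch: rules are data (field, operator, value), so new
# rules can be generated or loaded from configuration instead of being
# hard-coded as if/else chains.
OPS = {
    "gt": lambda a, b: a > b,
    "eq": lambda a, b: a == b,
    "in": lambda a, b: a in b,
}

def make_rule(field, op, value):
    """Compile a (field, op, value) triple into a predicate over a record."""
    return lambda record: OPS[op](record.get(field), value)

rule_specs = [
    ("amount", "gt", 1000),
    ("country", "in", {"US", "IN"}),
]
rules = [make_rule(*spec) for spec in rule_specs]

record = {"amount": 1500, "country": "IN"}
results = [rule(record) for rule in rules]
print(results)  # [True, True] — both rules fire for this record
```

In a PySpark setting, the same compiled predicates could be applied per row inside a DataFrame filter or mapped over an RDD, which is what makes the rules-as-data design scale.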

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

The Applications Development Technology Lead Analyst role is a senior position where you will be responsible for implementing new or updated application systems and programs in collaboration with the Technology team. Your main objective will be to lead applications systems analysis and programming activities.

Your responsibilities will include partnering with various management teams to ensure the integration of functions to achieve goals, identifying necessary system enhancements for new products and process improvements, resolving high-impact problems/projects by evaluating complex business processes, providing expertise in applications programming, ensuring application design aligns with the architecture blueprint, developing standards for coding, testing, debugging, and implementation, gaining comprehensive knowledge of how business areas integrate, analyzing issues to develop innovative solutions, advising mid-level developers and analysts, assessing risks in business decisions, and being a team player who can adapt to changing priorities.

The required skills for this role include strong knowledge of Spark using Java/Scala and the Hadoop ecosystem, with hands-on experience in Spark Streaming; proficiency in Java programming, with experience in the Spring Boot framework; and familiarity with database technologies such as Oracle and the Starburst and Impala query engines. Knowledge of bank reconciliation tools such as Smartstream TLM Recs Premium / Exceptor / Quickrec is an added advantage.
To qualify for this position, you should have 10+ years of relevant experience in an Apps Development or systems analysis role, extensive experience in system analysis and programming of software applications, experience in managing and implementing successful projects, and Subject Matter Expert (SME) status in at least one area of Applications Development. You should also demonstrate the ability to adjust priorities quickly, leadership and project management skills, clear and concise communication, and experience in building/implementing reporting platforms. A Bachelor's degree/University degree or equivalent experience is required (Master's degree preferred). This job description is a summary of the work performed, and other job-related duties may be assigned as needed.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

What we're looking for:
- Good knowledge and expertise in data structures and algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience with data warehousing concepts, including star schema, snowflake, or data vault, for data marts or data warehousing.
- Experience using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server.
- Hands-on knowledge and experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools).
- Experience with the Software Development Lifecycle using Agile methodology; knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira, Confluence).
- Expertise in conceptual modeling; the ability to see the big picture and envision possible solutions.
- Experience working in a challenging, fast-paced environment.
- Excellent communication and stakeholder management skills.

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders.
You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
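The star-schema concept this data-modeling role centers on is easy to see in miniature: one fact table keyed to surrounding dimension tables. The sketch below uses Python's built-in sqlite3 with invented table names purely for illustration; production data marts would use an enterprise database or Hive/Impala:

```python
import sqlite3

# A miniature star schema: fact_sales in the middle, two dimensions around it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);

    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0);
""")

# The canonical data-mart query shape: join the fact table to each
# dimension, then aggregate along dimension attributes.
rows = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)
```

A snowflake schema would further normalize the dimensions into sub-tables; a data vault would split them into hubs, links, and satellites, but the fact-to-dimension join pattern above is the common starting point.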

Posted 1 week ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Pune

Work from Office

Software Developers collaborate with Business and Quality Analysts, Designers, Project Managers, and more to design software solutions that will create meaningful change for our clients. They listen thoughtfully to understand the context of a business problem and write clean, iterative code to deliver a powerful end result, whilst consistently advocating for better engineering practices. By balancing strong opinions with a willingness to find the right answer, Software Developers bring integrity to technology, ensuring all voices are heard. For a team to thrive, it needs collaboration and room for healthy, respectful debate. Developers are the technologists who cultivate this environment while driving teams toward delivering on an aspirational tech vision and acting as mentors for more junior-level consultants. You will leverage deep technical knowledge to solve complex business problems and proactively assess your team's health, code quality, and nonfunctional requirements.

Job responsibilities:
- You will learn and adopt best practices like writing clean and reusable code using TDD, pair programming, and design patterns.
- You will use and advocate for continuous delivery practices to deliver high-quality software as well as value to end customers as early as possible.
- You will work in collaborative, value-driven teams to build innovative customer experiences for our clients.
- You will create large-scale distributed systems out of microservices.
- You will collaborate with a variety of teammates to build features, design concepts, and interactive prototypes, and ensure best practices and UX specifications are embedded along the way.
- You will apply the latest technology thinking to solve client problems.
- You will efficiently utilize DevSecOps tools and practices to build and deploy software, advocating a DevOps culture and shifting security left in development.
- You will oversee or take part in the entire cycle of software consulting and delivery, from ideation to deployment and everything in between.
- You will act as a mentor for less-experienced peers through both your technical knowledge and leadership skills.

Job qualifications

Technical skills:
We are looking for an experienced Scala Developer with 5+ years of expertise in building scalable data processing solutions.
- Excellent Scala and Apache Spark development skills.
- Experience with HDFS, Hive, and Impala.
- Proficiency in OOP, design patterns, and coding best practices.
- Experience in building real-time analytics applications, microservices, and ETL pipelines.
- You are comfortable with Agile methodologies, such as Extreme Programming (XP), Scrum, and/or Kanban.
- You have a good awareness of TDD, continuous integration, and continuous delivery approaches/tools.
- Bonus points if you have working knowledge of cloud technology such as AWS, Azure, Kubernetes, and Docker.

Professional skills:
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed.
- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs, and more.
- You're resilient in ambiguous situations and can approach challenges from multiple perspectives.
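The TDD practice named above has a simple rhythm: write a failing test first, then write just enough code to make it pass. A minimal, generic Python illustration (function and test names are invented; in practice the test would live in its own file and be run by a test runner such as pytest):

```python
# TDD in miniature: the test is written first and defines the behavior...
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Clean   Code ") == "clean-code"

# ...and the implementation is written only to satisfy that test.
def slugify(title):
    return "-".join(title.lower().split())

test_slugify()
print("tests passed")
```

The point is the ordering, not the code: the test documents the intended behavior before any implementation exists, which is what keeps the resulting code clean and reusable.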

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are looking for a Big Data Developer to build and maintain scalable data processing systems. The ideal candidate will have experience handling large datasets and working with distributed computing frameworks.

Key Responsibilities:
- Design and develop data pipelines using Hadoop, Spark, or Flink.
- Optimize big data applications for performance and reliability.
- Integrate various structured and unstructured data sources.
- Work with data scientists and analysts to prepare datasets.
- Ensure data quality, security, and lineage across platforms.

Required Skills & Qualifications:
- Experience with the Hadoop ecosystem (HDFS, Hive, Pig) and Apache Spark.
- Proficiency in Java, Scala, or Python.
- Familiarity with data ingestion tools (Kafka, Sqoop, NiFi).
- Strong understanding of distributed computing principles.
- Knowledge of cloud-based big data services (e.g., EMR, Dataproc, HDInsight).

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As an experienced professional with 3-5 years in the field, you will be responsible for handling various technical tasks related to Azure Data Factory, Talend/SSIS, MSSQL, Azure, and MySQL. Your expertise in Azure Data Factory will be crucial in this role. Your primary responsibilities will include demonstrating advanced knowledge of Azure SQL DB & Synapse Analytics, Power BI, SSIS, SSRS, T-SQL, and Logic Apps. Your ability to analyze and comprehend complex data sets will play a key role in your daily tasks. Proficiency in Azure Data Lake and other Azure services such as Analysis Services, SQL Databases, Azure DevOps, and CI/CD will be essential for success in this role. Additionally, a solid understanding of master data management, data warehousing, and business intelligence architecture will be required. You will be expected to have experience in data modeling and database design, with a strong grasp of SQL Server best practices. Effective communication skills, both verbal and written, will be necessary for interacting with stakeholders at all levels. A clear understanding of the data warehouse lifecycle will be beneficial, as you will be involved in preparing design documents, unit test plans, and code review reports. Experience working in an Agile environment, particularly with methodologies like Scrum, Lean, or Kanban, will be advantageous. Knowledge of big data technologies such as the Spark framework, NoSQL, Azure Databricks, and the Hadoop ecosystem (Hive, Impala, HDFS) would be a valuable asset in this role.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Programmer Analyst position at our organization is an intermediate level role where you will be involved in establishing and implementing new or updated application systems and programs in collaboration with the Technology team. Your primary goal will be to contribute to activities related to applications systems analysis and programming. Your responsibilities will include utilizing your knowledge of applications development procedures and concepts, as well as basic knowledge of other technical areas, to identify and define necessary system enhancements. You will be expected to identify and analyze issues, provide recommendations, and implement solutions. Additionally, you will use your understanding of business processes, system processes, and industry standards to solve complex problems. Analyzing information, making evaluative judgments, recommending solutions and improvements, conducting testing and debugging, utilizing script tools, and writing basic code for design specifications are also part of your responsibilities. You will need to assess the applicability of similar experiences and evaluate options under circumstances not covered by procedures. Developing a working knowledge of various aspects such as Citi's information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications will be crucial. Moreover, you are expected to appropriately assess risk when making business decisions, with a particular focus on safeguarding Citigroup, its clients, and assets by ensuring compliance with applicable laws, rules, and regulations.
Qualifications for this role include 2-5 years of relevant experience, experience in programming/debugging for business applications, working knowledge of industry practices and standards, comprehensive knowledge of a specific business area for application development, working knowledge of program languages, and consistently demonstrating clear and concise written and verbal communication. Education-wise, a Bachelor's degree/University degree or equivalent experience is required. In terms of skills, the ideal candidate should have a minimum of 3+ years of hands-on experience in data engineering. Proficiency in Hadoop, Spark, Hive, Impala, performance tuning, the Java programming language, SQL, and Oracle is essential. Certifications such as Java/Big Data would be considered a plus. This job description offers an overview of the work performed in this role, and additional job-related duties may be assigned as necessary. Citi is an equal opportunity and affirmative action employer. Time Type: Full time

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

indore, madhya pradesh

On-site

As a Senior Data Scientist with 5+ years of experience, you will play a crucial role in our team based in Indore/Pune. Your responsibilities will involve designing and implementing models, extracting insights from data, and interpreting complex data structures to facilitate business decision-making. You should have a strong background in Machine Learning areas such as Natural Language Processing, Machine Vision, Time Series, etc. Your expertise should extend to model tuning, model validation, and supervised and unsupervised learning. Additionally, hands-on experience with model development, data preparation, and deployment of models for training and inference is essential. Proficiency in descriptive and inferential statistics, hypothesis testing, and data analysis and exploration are key skills required for this role. You should be adept at developing code that enables reproducible data analysis. Familiarity with AWS services like SageMaker, Lambda, Glue, Step Functions, and EC2 is expected. Knowledge of data science code development and deployment IDEs such as Databricks, the Anaconda distribution, and similar tools is essential. You should also possess expertise in ML algorithms related to time series, natural language processing, optimization, object detection, topic modeling, clustering, and regression analysis. Your skills should include proficiency in Hive/Impala, Spark, Python, Pandas, Keras, scikit-learn, StatsModels, TensorFlow, and PyTorch. Experience with end-to-end model deployment and production for at least 1 year is required. Familiarity with model deployment on the Azure ML platform, Anaconda Enterprise, or AWS SageMaker is preferred. Basic knowledge of deep learning algorithms like Mask R-CNN and YOLO, and of visualization and analytics/reporting tools such as Power BI, Tableau, and Alteryx, would be advantageous for this role.
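The descriptive statistics and hypothesis testing this role requires can be illustrated with nothing but the Python standard library. The data below is invented, and the t statistic is computed by hand for transparency; a real analysis would reach for scipy.stats or StatsModels:

```python
import statistics
from math import sqrt

# Invented sample data: processing times (seconds) before and after a change.
before = [12.1, 11.8, 12.5, 12.0, 12.3, 11.9, 12.2, 12.4]
after  = [11.2, 11.5, 11.0, 11.4, 11.3, 11.6, 11.1, 11.2]

# Descriptive statistics.
mean_b, mean_a = statistics.mean(before), statistics.mean(after)
sd_b, sd_a = statistics.stdev(before), statistics.stdev(after)

# Welch-style t statistic for the difference in means (no p-value lookup
# here; scipy.stats.ttest_ind would give the full test).
se = sqrt(sd_b**2 / len(before) + sd_a**2 / len(after))
t_stat = (mean_b - mean_a) / se

print(round(mean_b - mean_a, 3))  # observed improvement in seconds
print(t_stat > 2.0)               # crude check: |t| well beyond ~2 suggests significance
```

The point for reproducible analysis is that every quantity (means, standard deviations, standard error) is computed explicitly from the raw samples, so the result can be re-derived and audited.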

Posted 1 week ago

Apply