2.0 - 7.0 years
50 - 55 Lacs
Bengaluru
Work from Office
Create a vision and determine KPIs for Auto on Amazon Music, grounded in customer research, customer usage data, and industry trends. Create and drive the Auto product roadmap, including feature definition, UX, trade-offs, and simplifiers. Run experiments end to end: design, testing, weblabs, and analysis. Analyze customer research and data, and work with BI to create dashboards. Manage global stakeholders. Create mechanisms to drive product alignment and decision-making across marketing, labels, and industry teams. Communicate with senior-level executives.
Posted 1 day ago
10.0 - 15.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Analyze and report on data quality statistics for P&C policy data. Data includes limits, deductibles, exposures, and other key elements of a policy. The analyst will work with operational leaders to consult on needed data.

Required candidate profile: Ability to construct visual representations of large data sets. Proficient in big data analysis tools, including SQL, R, or Python. Experience with visualization tools such as Power BI.
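To illustrate the kind of data-quality reporting this role describes, here is a minimal sketch in Python with pandas; the table, column names, and validity rule are hypothetical examples, not part of the posting:

```python
import pandas as pd

# Hypothetical P&C policy extract; column names are illustrative only.
policies = pd.DataFrame({
    "policy_id": ["P1", "P2", "P3", "P4"],
    "limit": [1_000_000, 500_000, None, 250_000],
    "deductible": [5_000, None, 2_500, 1_000],
    "exposure": [10, 3, 7, -1],
})

def quality_stats(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column completeness plus a simple validity rule for exposures."""
    stats = pd.DataFrame({
        "non_null_pct": df.notna().mean() * 100,  # % of non-missing values per column
    })
    # Example business rule: exposures must be non-negative.
    stats.loc["exposure", "invalid_count"] = (df["exposure"] < 0).sum()
    return stats

report = quality_stats(policies)
print(report)
```

A report like this can then feed the visual representations (e.g. in Power BI) that the profile asks for.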
Posted 1 day ago
6.0 - 8.0 years
5 - 8 Lacs
Mumbai
Hybrid
Job Description
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, ETL tools, ETL pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, SQL, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch integration, real-time integration, Agile methodologies.
Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL tools, Control-M, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, PySpark.
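One responsibility listed above, handling CDC (Change Data Capture) operations, can be sketched in miniature. This is a plain-Python illustration of upsert/delete semantics (in practice this would be a PySpark or Delta-style merge over large datasets); the record shapes are hypothetical:

```python
def apply_cdc(snapshot: dict, events: list) -> dict:
    """Apply a stream of CDC events (insert/update/delete) to a keyed snapshot.

    snapshot maps primary key -> record; each event is a dict with an 'op' field.
    """
    result = dict(snapshot)  # leave the input snapshot untouched
    for ev in events:
        key = ev["key"]
        if ev["op"] in ("insert", "update"):
            result[key] = ev["value"]  # upsert semantics
        elif ev["op"] == "delete":
            result.pop(key, None)      # tolerate deletes of absent keys
    return result

base = {1: {"name": "alice"}, 2: {"name": "bob"}}
changes = [
    {"op": "update", "key": 1, "value": {"name": "alicia"}},
    {"op": "delete", "key": 2},
    {"op": "insert", "key": 3, "value": {"name": "carol"}},
]
print(apply_cdc(base, changes))
```

The same upsert/delete logic is what a Spark job would express as a merge of a change feed into a target table.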
Posted 1 day ago
6.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Hybrid
Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, ETL tools, ETL pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, SQL, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch integration, real-time integration, Agile methodologies.
Mandatory Key Skills: Informatica, Jupyter Notebook, API integration, Unix/Linux, Git, AWS S3, Hive, Cloudera, Jasper, Airflow, Hadoop, data modeling, PySpark.
Posted 1 day ago
4.0 - 6.0 years
10 - 11 Lacs
Gurugram
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
About the role: We are looking for an experienced Senior BI Developer with strong client-facing skills to lead the design, development, and implementation of complex BI solutions, ensuring data accuracy and consistency, collaborating with cross-functional teams, and managing client relationships. You will be required to work closely with clients to identify business requirements and develop crisp, informative, and actionable data-driven BI solutions that meet those requirements.

In this role, you will:
- Utilize advanced data analysis techniques to develop insightful and visually compelling BI solutions, delivering clear and concise insights to stakeholders that support data-driven decision-making.
- Create and support customer-facing data products; you must be well versed in the data security aspects of BI products.
- Exhibit a self-starter mentality, embracing change and continuously seeking ways to improve processes and deliver high-quality results in an agile and timely manner.
- Continuously innovate, optimize processes, and develop new ways of creating best-in-class solutions.
- Keep up to date with the latest developments and emerging trends in the BI industry, and proactively identify opportunities to leverage new technologies and methodologies to enhance BI solutions.
- Additionally, possess exceptional presentation skills, with the ability to effectively communicate complex data-driven insights to both technical and non-technical stakeholders, using clear and concise language and compelling visuals to support the narrative.
- Willingness to work in a hybrid working model.

Mandatory skill sets (must-have knowledge, skills and experience):
- Experience with BI tools such as Power BI, Tableau, QlikView, or similar is a must.
- Strong experience in data visualization, interpretation, and analysis.
- High proficiency in modern BI tools and advanced MS skills.
- Experience working with Agile methodologies.
- Strong communication and problem-solving abilities for effective collaboration and innovative solutions.

Preferred skill sets (good-to-have knowledge, skills and experience):
- Ability to work in a fast-paced, dynamic environment.
- Ability to work independently and in a team environment.
- Strong attention to detail and ability to multitask.

Years of experience required: 4+ years of experience in developing BI solutions.
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above).
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering.
Degrees/Field of Study preferred: (not specified)
Certifications: (not specified)
Required Skills: Power BI
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
Job Posting End Date: (not specified)
Posted 1 day ago
4.0 - 6.0 years
2 - 6 Lacs
Bengaluru
Work from Office
We are seeking a skilled BI Support Analyst with 4-6 years of hands-on experience in BI support and report/dashboard generation. The ideal candidate will have strong expertise in BI tools such as Qlik Sense and Qlik NPrinting, along with solid database management and data modeling skills, particularly in SQL.

Key Responsibilities:
- Develop, support, and maintain reports and dashboards to meet business needs.
- Manage and optimize databases, ensuring data accuracy and integrity.
- Translate business requirements into clear, actionable technical solutions.
- Collaborate with stakeholders to understand and address BI needs.
- Apply data warehousing best practices to support data integration and reporting.
- Utilize programming languages to enhance data analysis capabilities.

Skills:
- 4-6 years of hands-on experience in BI support and report/dashboard creation.
- Proficient with Qlik Sense, Qlik NPrinting, and SQL.
- Strong analytical and problem-solving abilities.
- Excellent communication and interpersonal skills.
- Familiarity with data warehousing concepts.
- Knowledge of programming languages relevant to data analysis.
Posted 1 day ago
1.0 - 6.0 years
5 - 9 Lacs
Pune, Chennai, Bengaluru
Work from Office
About the Role: We are seeking a skilled Business Analyst with a strong background in omnichannel strategies to join our dynamic team. The ideal candidate will have experience in analyzing and optimizing business processes across multiple channels to enhance customer experience and drive business growth.

Key Responsibilities:
- Analyze Business Processes: Evaluate and improve business processes to ensure seamless integration across all channels (online, offline, mobile, etc.).
- Data Analysis: Collect, analyze, and interpret data from various sources to provide actionable insights.
- Project Management: Lead and manage projects related to omnichannel initiatives, ensuring timely delivery and alignment with business goals.
- Stakeholder Collaboration: Work closely with cross-functional teams including marketing, IT, and customer service to implement omnichannel strategies.
- Customer Experience Optimization: Identify opportunities to enhance customer experience across all touchpoints.
- Reporting: Develop and maintain reports and dashboards to track the performance of omnichannel initiatives.

Qualifications:
- Experience: Minimum of 3-5 years in a business analyst role with a focus on omnichannel strategies.
- Skills: Strong analytical and problem-solving skills. Proficiency in data analysis tools (e.g., SQL, Excel, Tableau). Excellent communication and presentation skills. Experience with project management methodologies. Knowledge of customer journey mapping and user experience design.
Posted 1 day ago
3.0 - 5.0 years
11 - 16 Lacs
Mumbai, Bengaluru
Work from Office
Mandatory skillset:
- Programming languages: proficiency in SQL and Python (Robot Framework) for writing automated tests and conducting data validation.
- Experience with test automation frameworks (e.g., Selenium or Postman).
- Experience with Jira or any other test management tool.
- Basic understanding of CI/CD tools such as Jenkins, GitLab, and AWS CodePipeline.

Responsibilities:
- Develop and maintain automated test scripts using SQL and Python to validate data accuracy across banking platforms.
- Implement and manage test automation frameworks with tools like Selenium and Postman to streamline testing processes and improve efficiency.
- Utilize AWS services (e.g., S3, EC2) for testing and ensure applications perform optimally in cloud environments, focusing on scalability and reliability.
- Conduct security testing, ensuring adherence to secure SDLC practices and compliance with industry regulations.

Good to have:
- Security testing: knowledge of SCAS, SAST, and DAST/WAS, and experience within secure SDLC frameworks.
- Knowledge of Hadoop and its ecosystem.
- Data-oriented testing skills: experience with data warehousing and ETL processes, as well as quality management on platforms like Databricks.
- Cloud knowledge: familiarity with AWS services (e.g., S3, EC2).
- Knowledge of BI tools: Qlik Sense / IBM Cognos.
- Test strategy design: expertise in creating test strategies for data governance, data lineage, and compliance testing.
- Cross-functional collaboration: demonstrated experience collaborating with data engineering, DevOps, and application development teams.
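A data-validation script of the kind this role calls for (SQL executed from Python) can be sketched as follows; the table, columns, and validation rule are hypothetical, using an in-memory SQLite database purely for illustration:

```python
import sqlite3

# Hypothetical banking table; schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transactions (id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
)
conn.executemany(
    "INSERT INTO transactions (id, amount, currency) VALUES (?, ?, ?)",
    [(1, 150.0, "INR"), (2, -20.0, "INR"), (3, 99.5, "USD")],
)

def check_no_negative_amounts(cx: sqlite3.Connection) -> list:
    """Return rows violating the 'amount must be non-negative' rule."""
    return cx.execute(
        "SELECT id, amount FROM transactions WHERE amount < 0"
    ).fetchall()

violations = check_no_negative_amounts(conn)
print(violations)
```

In a real suite, checks like this would run as assertions inside a framework such as Robot Framework or pytest against the actual banking database.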
Posted Just now
3.0 - 5.0 years
9 - 14 Lacs
Chennai, Bengaluru
Work from Office
Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data.

Key Responsibilities:
- Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
- FLEXCUBE 14.7 Backend Tables: Knowledge of the FLEXCUBE data model is a must.
- Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data.
- Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
- Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
- Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
- Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
- User Training: Provide training and support to end-users to enable them to effectively utilize Qlik Sense reports.
- Documentation: Maintain documentation for reports, data models, and best practices.
Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense.
- Familiarity with FLEXCUBE core banking systems.
- Familiarity with OLAP cubes, data marts, and data warehouses.
- Proficiency in data modelling and data visualization concepts.
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Banking or financial industry experience is beneficial.
- Qlik Sense certifications are a plus.
Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment.
Posted Just now
0.0 - 2.0 years
1 - 3 Lacs
Mumbai
Work from Office
Sourcing of electronic ICs and active and passive components. Sourcing of electromechanical parts and assemblies. Knowledge of global sourcing. Knowledge of Incoterms and customs clearance. Develop and manage a supply base (domestic and international) to support all PCBA and other parts assemblies through NPD and production phases. Component compatibility selection and evaluation, sensor exposure, and connectivity, including but not limited to BLE, WiFi, Cellular, etc. Work directly with key suppliers to develop and implement sourcing and cost management strategies that support existing platforms and new product development. Provide guidance to internal customers on design for manufacturability and trade analysis for electronic piece parts on PCBAs, onboarding requirements, and quality. Act as the supplier interface for all RFIs, RFPs, and RFQs. Partner with internal resources to drive adoption of low-cost parts and avoidance of obsolescence. Conduct comprehensive supplier analysis, including strategic direction, capacity, market and industry position, and risk assessment. Work with the Global Supply team for process support and be part of the matrix organization. Evaluate quotes for engineering compliance against the specified manufacturing package and address technical queries.
Strong negotiation, communication, and analytical skills. Knowledge of vendor sourcing practices. Proficiency in purchasing/ERP software. Understanding of supply chain procedures. Ability to identify market trends. Ability to make decisions in a high-stress environment. Inventory management. Proficiency in data analysis tools (e.g., Excel, Power BI). Excellent communication and presentation skills. Understanding of business operations and key performance metrics. Strong understanding of quality management systems. Experience in manufacturing processes and quality control methodologies. Excellent analytical and problem-solving skills to identify root causes of issues. Effective communication and interpersonal skills to interact with suppliers and internal stakeholders. Ability to work independently and as part of a team. Strong attention to detail and ability to maintain accurate documentation. Data analysis: collect, analyse, and interpret data from various sources. Report generation: create reports to support decision-making. Travel may be required to visit supplier locations.

Skills: Experience reading technical drawings, data sheets, fabrication drawings, BOMs, and all related PCBA and harness files. Knowledge of electronic components, PCBs, connectors, and silicon front-end and back-end assembly and test. Knowledge of REACH and RoHS standards, and of EOL and PCN documents for electronic components. Knowledge of legal documents with vendors/EMS, such as SOW and NDA. Knowledge of cloud-based BOM management tools: Silicon Expert, IHS Markit, Altium 365, Source Engine, etc. Proven track record of taking ownership, successfully negotiating preferred pricing, and driving results. Support and analysis for the product functional-cost matrix.

Experience: Experience working in a supply chain, production, or engineering function.
Experience in EMS is an added advantage. Technical procurement or planning experience with PCB or PCBA; experience managing external supplier partners. Key interfaces: engineering team, marketing team, project engineer/leader, external vendors and suppliers. Education: Bachelor of Engineering/Science (Electronics preferred) or Diploma in Engineering (Electronics preferred).
Posted Just now
6.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, ETL tools, ETL pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, SQL, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch integration, real-time integration, Agile methodologies.
Mandatory Key Skills: Apache Spark, Python, Unix/Linux, performance tuning, Agile methodologies, Hadoop, ETL, PySpark.
Posted Just now
6.0 - 11.0 years
8 - 12 Lacs
Chennai
Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, ETL tools, ETL pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, SQL, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch integration, real-time integration, Agile methodologies.
Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL tools, Control-M, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, PySpark.
Posted 1 hour ago
2.0 - 5.0 years
4 - 8 Lacs
Kolkata
Work from Office
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: CI/CD, Zeppelin, PyCharm, PySpark, Apache Spark, Python, ETL tools, ETL pipelines, Control-M, Airflow, unit test cases, Tableau, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix/Linux, shell scripting, Git, AWS S3, SQL, Hive, Hadoop, Cloudera, Hortonworks, Jasper, CDC, data modeling, data validation, AI/ML model development, batch integration, real-time integration, Agile methodologies.
Mandatory Key Skills: CI/CD, Zeppelin, PyCharm, ETL, Control-M, performance tuning, Jenkins, QlikView, Informatica, Jupyter Notebook, API integration, Unix, PySpark.
Posted 1 hour ago
2.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are currently seeking a Senior Business Analyst - Data to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: The Data Business Analysts will work in the NTT DATA CPO team, reporting to the CPO or a delegate. The Data Business Analysts will manage, facilitate, and execute the requirements management processes that are put in place for the ODIE programme. They will also provide input to the design and maintenance of these requirements management processes.

Job Responsibilities:
- Understand the BA organisation and identify the relevant parties and stakeholders for DPF processes.
- Form working relationships with relevant stakeholders to facilitate communication and understanding.
- Proactively seek out business needs and expectations from across ODIE and relevant parts of the wider BA organisation, then digest and translate these into suitable objects (e.g. requirements, epics, user stories, outcomes).
- Manage and communicate the processes of the requirements lifecycle and prioritisation.
- Ensure that requirements (and related concepts) are sufficient to allow high-quality design, engineering, and testing to meet the agreed expectations and needs of the business.
- Ensure that requirements (and related concepts) are recorded, tracked, measured, audited, and managed across the programme.

Ideal Prior Experience:
- Has worked with a range of data platforms and solutions.
- Has been involved in understanding, developing, and communicating technical requirements for data solutions.
- Experience working within a complex and/or high-pressure environment with technical and business stakeholders.
- Knowledge and experience working with a range of data technologies.
Posted 1 hour ago
7.0 years
0 Lacs
bengaluru
On-site
DESCRIPTION India Payments org comprises three product portfolios: (a) Core Payments Portfolios & Affordability solutions like IBD, NCEMI & Rewards, (b) First-party Payment Instruments a.k.a. Amazon Pay Products (APP) (5 of them), and (c) Everyday Use Cases (EUC) (32 of them). These businesses are significantly different from each other and involve large data exchanges and analysis. Hence the need for AI/ML products for insights, customer targeting and deepening customer engagement is fast growing. Amazon Pay India is looking for a product manager with a strong Data Science / Tech background to guide the vision and mission of the Advanced analytics team at Amazon Pay, guiding a team of highly motivated Data Scientists and analysts. Key job responsibilities The leader would work closely with software developers, category, business and operations teams to get the products developed. a) Own and develop further the portfolio of data science products (such as Outlier Analysis Tools, Economic profit estimation tools, Next Best Action prediction, Advanced targeting, Spend optimization, and existing data pipes to other Amazon ML teams to guide their prediction). b) Build and interpret causal models. c) Own the Propensity Data Platform (PDP), built using deep neural networks, that powers automated widgets and is used for business insights. The platform also generates customer segments for marketing, impacting an estimated 10M customers on a weekly basis and improving their customer engagement. d) Write BRDs and other Amazon standard review docs to communicate vision and progress to management. About the team The Pay data team consists of 3 verticals: a) Platform team, who maintain the Single Source of Truth for all Pay data. b) Business partnering team, who work with business stakeholders to serve their reporting and other analysis needs. c) Advanced analytics team, who build AI/ML products that deal with targeting, optimisation and anomaly detection, and help determine causality.
These model outputs can also be consumed by business leaders for gaining deeper insight. The role being publicised is for this Advanced analytics leader. BASIC QUALIFICATIONS Bachelor's degree Experience in representing and advocating for a variety of critical customers and stakeholders during executive-level prioritization and planning Experience contributing to engineering discussions around technology decisions and strategy related to a product 7+ years of technical product or program management experience Experience owning/driving roadmap strategy and definition Experience with feature delivery and tradeoffs of a product PREFERRED QUALIFICATIONS Experience in using analytical tools, such as Tableau, Qlikview, QuickSight Master's degree Experience managing data pipelines Experience as a leader and mentor on a data science team Knowledge of Statistics Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 day ago
3.0 years
0 Lacs
bengaluru, karnataka, india
Remote
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com. This position is based in Bengaluru and will require some on-site work. Purpose And Scope As a Developer specializing in Business Intelligence (BI) and ETL (Extract, Transform, Load) technologies, you will play a crucial role in designing, developing, and optimizing data solutions for our organization. Your expertise in BI tools and ETL processes will contribute to the success of our data-driven initiatives. You’ll leverage your expertise in Qlik Sense, Tableau, and ETL technologies such as Talend, Databricks or dbt to drive informed decision-making within the organization. Collaboration, leadership, and problem-solving skills are essential in this role. Essential Job Responsibilities Business Intelligence Tools: Qlik/Tableau: Proficiency in designing, developing, and maintaining Qlik/Tableau applications. Experience with Qlik Sense and QlikView is highly desirable. Experience working with Qlik NPrinting and Qlik Alerting. Proficient at creating and consuming complex Qlik/Power BI data models. Power BI: Strong expertise in creating interactive reports, dashboards, and visualizations using Power BI. Knowledge of DAX (Data Analysis Expressions) is essential, as is experience with Power Automate (MS Flow) or Power BI alerts.
Data Modeling and Integration: Best-practice knowledge of modelling data that is to be consumed by Qlik Sense or Power BI. Ability to design, develop and implement logical and physical data models. Familiarity with data warehousing concepts, including star schema, snowflake schema, and data marts. Understanding of ETL (Extract, Transform, Load) processes and data integration techniques. SQL and Database Management: Proficiency in SQL for querying and manipulating data. Knowledge of database management systems (e.g., SQL Server, Oracle, MySQL). Data Governance and Quality: Understanding of data governance principles, data lineage, and metadata management. Experience ensuring data quality, consistency, and accuracy. Experience performance-tuning BI data models and reports to ensure apps are responsive to users’ needs. Proven skill at creating dashboards of pixel-perfect quality. The ability to work with end users to uncover business requirements and turn these into powerful, action-oriented applications. Create and maintain technical documentation. Ability to communicate and collaborate effectively with other technical team members to agree the best solution. An initiative-taker, able to work independently but knowing when to collaborate with the wider team to ensure the best overall solution. Experience working with data warehousing and data modelling. Understanding of Qlik, Tableau and/or Power BI architecture, or any equivalent technology. Ability to install, configure and upgrade Qlik/Power BI or equivalent systems. Conducting unit testing and troubleshooting BI systems. Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics. Data Validation and Quality Assurance: Execute manual and automated development on data pipelines, ETL processes, and analytical models. Verify data accuracy, completeness, and consistency. Identify anomalies, discrepancies, and data quality issues. Ensure compliance with industry standards (e.g., GxP, HIPAA).
Ensuring Data Security and Compliance with relevant internal, global and/or regional regulations. Assist users with ad-hoc queries and data troubleshooting. Use analytics to understand root causes and drive decision-making for customer acquisition and retention. Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement. Agile Champion: Adherence to DevOps principles and a proven history with CI/CD pipelines for continuous delivery. Responsibilities Development Ownership: Collaborating with senior developers to support key development projects related to our Data Warehouse and other MI systems. Collaborate with senior team members within a multi-skilled project team. Contribute to the efficient administration of multi-server environments. Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold-move priority initiatives and beyond. Design, develop and implement robust and scalable data analytics using modern technologies. Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, DataX, GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions. Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible. Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security and maintainability of existing data and platform architecture and other technology investments. Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation and deployment. Ensuring effective management of the FoundationX backlog.
Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization. Stay up to date on the latest trends and technologies in data engineering and cloud platforms. Qualifications Required Bachelor's degree in Computer Science, Information Technology, or a related field (Master’s preferred) or equivalent experience. 3-5+ years’ experience as a Developer or Data Analyst within a pharmaceutical or similar regulated environment. Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement. Familiarity with Business Intelligence and Data Warehousing concepts. Web integration skills (Qlik Sense). Experience with other BI tools (Tableau, D3.js) is a plus. Strong SQL expertise and database design skills. Understanding of stored procedures, triggers, and tuning. Subject Matter Expertise: possess a strong understanding of data architecture/engineering/operations/reporting within the Life Sciences/Pharma industry across Commercial, Manufacturing and Medical domains. Other complex and highly regulated industry experience will be considered across diverse areas like Commercial, Manufacturing and Medical. Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools. Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization. Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery. Other Critical Skills Required Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges. Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service, and enabling strategic insights and decision-making. Preferred Experience working in the Pharma/Life Sciences industry. Experience in storytelling with data. Visualization best practices. Knowledge/experience using Qlik or Power BI SaaS solutions. Working Environment At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas’ Responsible Flexibility Guidelines. Category FoundationX Astellas is committed to equality of opportunity in all aspects of employment. EOE including Disability/Protected Veterans
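The star-schema modelling named in the requirements above boils down to joining a fact table to its dimension tables and aggregating. A self-contained SQLite sketch of that pattern (the table names, columns, and figures are invented for illustration, not from any real warehouse):

```python
import sqlite3

# Toy star schema: a sales fact table joined to a product dimension,
# then aggregated per dimension attribute. Names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Dairy'), (2, 'Bakery');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
rows = con.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Bakery', 7.5), ('Dairy', 15.0)]
```

A snowflake schema differs only in that the dimension itself is further normalized into sub-dimension tables.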
Posted 1 day ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
As a VP Hedge Accounting Transformation at Barclays, you will embark on a transformative journey by designing and delivering systemic solutions to the accounting specialism of Hedge Accounting. Your role involves expanding the existing product offering under IAS 39, considering accounting legislation in different jurisdictions, and adopting IFRS 9 and Dynamic Risk Management in the longer term. You will work on delivering extensions to the existing platform while ensuring alignment with finance architecture strategy, standardization, efficiency of operation, and meeting business requirements. Key Responsibilities: - Become a trusted advisor to Treasury, Finance, PIO, and technology colleagues regarding the Hedge Accounting Transformation programme and wider Finance business architecture strategy. - Actively drive transformation outcomes for the function through a strategic lens. - Proactively identify opportunities for improvement, develop conversations, and challenge the status quo. - Champion the transformation journey. - Provide guidance and support to Treasury transformation teams and business users across Treasury. - Present and influence key stakeholders at the Design Authority, Project Forums, and other project meetings. Qualifications Required: - Demonstrable track record within a Hedge Accounting, Treasury, or MTM Product Control environment, working on relevant projects. - Knowledge of interest rate derivatives, risk drivers, Finance processes, systems, and technologies. - Professional accounting qualification. - Range of leadership and communication styles and techniques, including influencing and negotiating with stakeholders. - Appreciation of data principles, including data modeling and design. - Strong data manipulation skills with Excel and experience using data manipulation tools (e.g., QlikView, Business Objects, Lumira, SmartView, SQL, SAS). - Excellent PowerPoint skills for storyboards and presentations.
Additional Company Details: The location of the role is Noida, IN. Purpose of the role: To develop business capabilities for Finance through functional design, data analysis, end-to-end processes, controls, delivery, and functional testing. Accountabilities: - Functional Design: support options analysis and recommendations, in collaboration with Line SMEs. - Data Analysis/Modelling/Governance: design conceptual data model and governance requirements. - End-to-End Process & Controls: develop target process and controls design/documentation. - Delivery/Implementation Support: update design/functional requirements, resolve RAIDs, and manage change programs. - Functional Testing: develop scripts and data for testing alignment to requirement definitions. Vice President Expectations: - Contribute to strategy, drive requirements, and make recommendations for change. - Manage policies and processes, deliver continuous improvements, and escalate policy breaches. - Advise key stakeholders and demonstrate leadership in managing risk and strengthening controls. - Collaborate with other areas, create solutions based on analytical thought, and build trusting relationships. All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive.
Posted 1 day ago
7.0 - 10.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Description The ideal candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. The candidate should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive, and go beyond expectations to achieve job results and create new opportunities. The role must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives. Senior Process Manager Role and responsibilities: Understand the business problem and requirements by building domain knowledge and translate them into a data science problem. Conceptualize and design a cutting-edge data science solution to solve the data science problem, applying design-thinking concepts. Identify the right algorithms, tech stack, and sample outputs required to efficiently address the end need. Prototype and experiment with the solution to successfully demonstrate its value. Independently, or with support from the team, execute the conceptualized solution as per plan by following project management guidelines. Present the results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights and visualization.
Help build overall data science capability for eClerx through support in pilots, pre-sales pitches, product development, and practice development initiatives. Technical and Functional Skills: Bachelor’s degree in Computer Science with 7 to 10 years of work experience. Must have experience in Advanced Analytics, Data Science, regression, forecasting, analytics, SQL, R, Python, decision trees, random forests, SAS, clustering and classification. Ability to engage clients to understand business requirements and convert them into technical/modelling problems for solution development. Demonstrate strong interpersonal skills and a comfort interacting with clients from the C-suite to marketing managers to technical specialists. Demonstrated knowledge of analytical/statistical techniques and their applications; a working knowledge of/experience in R and Python is a plus. Demonstrated excellent communication skills, both written and spoken, as well as being able to explain complex technical concepts in plain English. Ability to present results of statistical models in business language. Domain understanding of at least one, preferably two, verticals amongst Retail, Cable, Technology (not mandatory). Knowledge of data visualization tools (Tableau, QlikView, etc.) is a plus. Demonstrate strong analytical and storytelling skills and the ability to find relevant stories from piles of reports. Ability to manage specific tasks to completion with minimal direction. The ideal candidate has been in a consulting role previously. Hands-on expertise in applied statistical techniques including multivariate regression, logistic regression, market-mix models, clustering, classification, survival, churn models, speech analytics, image analytics, etc. Ability to collaborate with onsite colleagues in the US & UK. Expert in handling large data, and its cleansing & preparation for modelling. Very high attention to detail and quality.
About Us At eClerx, we serve some of the largest global companies – including 50 Fortune 500 clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value. About The Team eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 1 day ago
5.0 - 7.0 years
0 Lacs
gurugram, haryana, india
On-site
Role Purpose The role is responsible for performing deep data-driven analysis, providing actionable insights, working with external parties to explore new attributes to help improve credit risk for the SBIC credit card portfolio, and providing loss forecasting. Further, set up new reporting processes in line with internal or auditors’ requirements. Role Accountability Perform deep-dive analysis on the credit card portfolio utilizing large amounts of data. Help improve the performance of the credit card portfolio, both in terms of underwriting and portfolio management/collections, by providing actionable insights. Liaise with external parties like credit bureaus to explore new attributes that can help improve credit quality. Develop and manage the credit cost forecasting model for the MOU. Develop and maintain a process for loss/credit cost forecasting for new products. Provide risk-related insights on existing products/channels etc. Liaise with internal stakeholders to understand and influence credit-related decisions. Understand new requirements from internal and external stakeholders and regulators, and set up new processes and maintain them. Measures of Success Timely and accurate generation of all MIS & Reports. Credit forecasting model accuracy within established thresholds. Closure of MOU in timely fashion and timely circulation of estimates. Timely and accurate delivery of analytical projects driving actionable insights. No adverse regulatory/audit findings. Timely and accurate updating of process documentation. Process adherence as per MOU. Technical Skills / Experience / Certifications Good knowledge of SAS, SQL, MS Office, BI tools (SAS VA / Tableau / QlikView) etc. Knowledge of R / Python preferred. Ability to understand all credit card related data elements. Ability to work in a distributed data storage environment. Competencies critical to the role Self-driven, able to work in teams or independently. Strong attention to detail and ability to notice discrepancies.
Strong problem-solving skills. Hands-on experience in driving data-driven projects. Ability to prepare and present information in a structured manner. Ability to work with senior management and present insights and analysis. Qualification 5 to 7 years' prior experience of working in the Banking / NBFC / Credit Card industry / BFSI captive. Prior experience in Analytics, Automation, Business Intelligence.
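The loss/credit cost forecasting described in this role is conventionally built on the expected-loss identity EL = PD × LGD × EAD (probability of default × loss given default × exposure at default), rolled up across accounts. A minimal sketch of that roll-up (all probabilities and exposures below are invented for illustration):

```python
# Expected-loss roll-up: EL = PD * LGD * EAD per account, summed over
# the portfolio. The numbers are made up, not from any real book.
def expected_loss(accounts):
    return sum(a["pd"] * a["lgd"] * a["ead"] for a in accounts)

portfolio = [
    {"pd": 0.02, "lgd": 0.9, "ead": 50_000},   # low-risk card account
    {"pd": 0.10, "lgd": 0.9, "ead": 20_000},   # higher-risk card account
]
print(expected_loss(portfolio))  # about 2700 (900 + 1800)
```

Forecasting then amounts to projecting each factor forward (e.g. PD from a scorecard or roll-rate model, EAD from balance/limit utilisation) and re-running the same roll-up per period.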
Posted 2 days ago
7.0 - 10.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Roles & Responsibilities Eligibility Minimum Qualifications Bachelor’s degree in computer science or a related field OR master’s degree in statistics, economics, business economics, econometrics, or operations research. 7-10 years of experience in the Analytics/Data Science domain. Proficiency in programming languages such as Python. Experience with Generative AI techniques and tools. Familiarity with ETL methods, data imputation, data cleaning, and outlier handling. Familiarity with cloud platforms (AWS, Azure, GCP) and AI/ML services. Knowledge of databases and associated tools such as SQL. Technical Skills – Desirable Expertise in NLP and Generative AI concepts/methods/techniques like — Prompt design/engineering — Retrieval Augmented Generation (RAG), Corrective RAG and Knowledge Graph-based RAG using GPT-4o — Fine-tuning through LORA/QLORA — Multi-agentic frameworks for RAG — Reranker etc. for enhancing the plain-vanilla RAG — Evaluation frameworks like G-Eval etc. Strong understanding of Deep Learning methods and Machine Learning techniques including Ensemble methods, Support Vector Machines, and Natural Language Processing (NLP). Exposure to Big Data technologies like Hadoop, Hive, Spark. Experience with advanced reporting tools such as Tableau, Qlikview, or PowerBI. Specific Responsibilities Requirement Gathering: — Translate business requirements into actionable analytical plans in collaboration with the team. — Ensure alignment of analytical plans with the customer’s strategic objectives. Data Handling: — Identify and leverage appropriate data sources to address business problems. — Explore, diagnose, and resolve data discrepancies, including ETL tasks, missing values, and outliers. Project Leadership and Execution: — Individually lead projects, proof-of-concept (POC) initiatives, or development efforts from inception to completion. 
— Contribute to the development and refinement of technical and analytics architecture, ensuring it aligns with project and organizational goals. — Design and implement scalable and robust analytical frameworks and data pipelines to support advanced analytics and machine learning applications. — Manage project timelines, resources, and deliverables, coordinating with cross-functional teams to achieve project goals. — Ensure the delivery of production-ready models and solutions, meeting quality and performance standards. — Monitor success metrics to ensure high-quality output and make necessary adjustments. — Create and maintain documentation/reports. Mentoring and Team Development: — Mentor junior data scientists, providing guidance on technical challenges and career development. — Facilitate knowledge sharing within the team, promoting best practices and fostering a collaborative environment. — Conduct code reviews to ensure high-quality, maintainable, and efficient code. Innovation and Best Practices: — Stay informed about new trends in Generative AI and integrate relevant advancements into our solutions. — Implement novel applications of Generative AI algorithms and techniques in Python. — Evaluate and adopt best practices in AI/ML development to enhance the effectiveness and efficiency of our solutions. Sample Projects GenAI-powered self-serve analytics solution for a global technology giant, that leverages the power of multi-agent framework and Azure OpenAI services to provide actionable insights, recommendations, and answers to tactical questions derived from web analytics data. GenAI bot for querying on textual documents (e.g., retail audit orientation, FAQ documents, research brief document etc.) 
of a multinational dairy company, and getting personalized responses in a natural and conversational way, based on the structured context of the user (like their personal details), along with citations, so that one can effortlessly carry out first-hand validation themselves. GenAI bot for querying a tabular dataset (like monthly KPI data) of a leading global event agency, to understand and process natural language queries on the data and generate appropriate responses in textual, tabular and visual formats. GenAI-powered advanced information retrieval from structured data for a leading global technology organization. TimesFM modelling for advanced time series forecasting for a global retail chain. Knowledge-Graph-based GenAI solution for knowledge retrieval and semantic summarization for a leading global event agency. GenAI-powered shopping assistant solution for big-box warehouse club retail stores. GenAI solution using a multi-agentic framework for a travel-hospitality use case. Input governance and response governance in GenAI solutions. Development and implementation of evaluation frameworks for GenAI solutions/applications. Training foundational models on new data using open-source LLMs or SLMs. Experience 8-11 Years Skills Primary Skill: Data Science Sub Skill(s): Data Science Additional Skill(s): Data Science, GenAI Fundamentals About The Company Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP).
Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
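The plain-vanilla RAG pattern listed among this posting's desirable skills works in two steps: retrieve the documents most relevant to a query, then pass them to the generator as context. A toy bag-of-words retriever illustrating the retrieval step (the corpus and query are invented; real systems use dense embeddings plus a reranker, as the posting notes):

```python
import math
from collections import Counter

# Toy retrieval step of a RAG pipeline: rank documents by cosine
# similarity of bag-of-words term counts, return the top-k to feed
# the generator as context. Corpus and query are invented.
def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for _, d in sorted(scored, reverse=True)[:k]]

docs = ["monthly KPI data for events", "retail audit orientation guide"]
print(retrieve("audit orientation", docs))
```

Corrective RAG and knowledge-graph RAG, also named above, swap in extra steps (answer grading, graph traversal) around this same retrieve-then-generate core.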
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
Role Overview: As a Data Scientist at mPokket, you will be responsible for collaborating with the data science team to plan projects and build analytics models. Your strong problem-solving skills and proficiency in statistical analysis will be key in aligning our data products with our business goals. Your primary objective will be to enhance our products and business decisions through effective utilization of data. Key Responsibilities: - Oversee the data scientists' team and data specialists, providing guidance and support. - Educate, lead, and advise colleagues on innovative techniques and solutions. - Work closely with data and software engineers to implement scalable sciences and technologies company-wide. - Conceptualize, plan, and prioritize data projects in alignment with organizational objectives. - Develop and deploy analytic systems and predictive models, and explore new techniques. - Ensure that all data projects are in sync with the company's goals. Qualifications Required: - Master's degree in Computer Science, Operations Research, Econometrics, Statistics, or a related technical field. - Minimum of 2 years of experience in solving analytical problems using quantitative approaches. - Proficiency in communicating quantitative analysis results effectively. - Knowledge of relational databases and SQL, and experience in at least one scripting language (PHP, Python, Perl, etc.). - Familiarity with statistical concepts such as hypothesis testing and regressions, and experience in manipulating data sets using statistical software (e.g., R, SAS) or other methods. Additional Details: mPokket is a company that values innovation and collaboration. The team culture encourages learning and growth, providing opportunities to work on cutting-edge technologies and projects that have a real impact on the business. Thank you for considering a career at mPokket.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Solution Design Business Analyst - Vice President in our company, you will play a crucial role in driving key strategic change initiatives for regulatory deliverables across Risk, Finance, and Treasury. To excel in this role, you should have the following skills and experience: - Required experience in business/data analysis to present complex data issues in a simple and engaging manner. - Proficiency in front to back system designing and complex business problem solutioning, including data gathering, data cleansing, and data validation. - Ability to analyze large volumes of data, identify patterns, address potential data quality issues, conduct metrics analysis, and turn analysis into actionable insights. - Experience in capturing business requirements and translating them into technical data requirements. - Strong collaboration skills to work with stakeholders and ensure proposed solutions meet their needs and expectations. - Capability to create operational and process designs to ensure proposed solutions are delivered within the agreed scope. Additionally, highly valued skills may include working experience in the financial services industry, familiarity with data analysis tools like SQL, Hypercube, Python, and data visualization/reporting tools such as Tableau, Qlikview, Power BI, and Advanced Excel, as well as expertise in data modeling and data architecture. In this role, you will be based in Pune and Chennai and will function as an Individual Contributor. The purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and providing solutions to address business problems and opportunities. Your key responsibilities will include: - Identifying and analyzing business problems and client requirements that necessitate change within the organization. - Developing business requirements to tackle business problems and opportunities. 
- Collaborating with stakeholders to ensure proposed solutions align with their needs.
- Supporting the creation of business cases justifying investment in proposed solutions.
- Conducting feasibility studies to assess the viability of proposed solutions.
- Creating reports on project progress to ensure timely and on-budget delivery of proposed solutions.
- Providing support for change management activities and ensuring successful implementation and embedding of proposed solutions in the organization.

As a Vice President, you are expected to contribute to setting strategy, driving requirements, and making recommendations for change. You will be responsible for planning resources, budgets, and policies; managing and maintaining policies and processes; delivering continuous improvements; and escalating breaches of policies or procedures. If you have leadership responsibilities, you are expected to demonstrate leadership behaviors that create an environment for colleagues to excel. The four LEAD behaviors are: Listen and be authentic, Energize and inspire, Align across the enterprise, and Develop others.

Overall, as a valuable member of our team, you are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and demonstrate the Barclays Mindset to Empower, Challenge, and Drive in your daily interactions.
Posted 2 days ago
1.0 - 6.0 years
0 Lacs
karnataka
On-site
Role Overview:
You will be responsible for developing analytics-based decision-making frameworks for clients in the Banking and Insurance sectors. Additionally, you will handle project management and client management, and support business development and new analytics solution development activities.

Key Responsibilities:
- Develop analytics-based decision-making frameworks for clients in the Banking and Insurance sectors
- Manage projects effectively
- Build and maintain relationships with clients
- Contribute to business development and new analytics solution development
- Utilize statistical modeling techniques such as Logistic Regression, Linear Regression, GLM Modeling, Time-Series Forecasting, and Scorecard Development
- Use statistical tools such as SAS, Python, and R
- Experience with Tableau and QlikView would be advantageous
- Apply data mining techniques such as Clustering and Segmentation
- Knowledge of machine learning and Python would be beneficial

Qualifications Required:
- A B.Tech degree from a top-tier engineering school, or a Master's in Statistics/Economics from a reputed university
- A minimum of 6 years of relevant experience, with at least 1 year of managerial experience, for the Senior Consultant position
- A minimum of 1 year of experience for Associate Consultant roles and 3 years for Consultant roles

Additional Company Details:
EY is committed to creating an inclusive working environment and offers flexible working arrangements to help you achieve a work-life balance. The company values collaboration, problem-solving skills, and delivering practical solutions to complex issues. EY believes in providing training, opportunities, and creative freedom to its employees, with a focus on building a better working world.
Posted 2 days ago
4.0 years
0 Lacs
gurugram, haryana, india
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary:
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary:
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

About the role:
We are looking for an experienced Senior BI Developer with strong client-facing skills to lead the design, development, and implementation of complex BI solutions, ensuring data accuracy and consistency, collaborating with cross-functional teams, and managing client relationships. You will be required to work closely with clients to identify business requirements and develop crisp, informative, and actionable data-driven BI solutions that meet those requirements.

In this role, you will:
- Utilize advanced data analysis techniques to develop insightful and visually compelling BI solutions, delivering clear and concise insights to stakeholders that support data-driven decision-making.
- Create and maintain customer-facing data products; you must be well versed in the data security aspects of BI products.
- Exhibit a self-starter mentality, embracing change and continuously seeking ways to improve processes and deliver high-quality results in an agile and timely manner.
- Continuously innovate, optimize processes, and develop new ways of creating best-in-class solutions.
- Keep up to date with the latest developments and emerging trends in the BI industry, and proactively identify opportunities to leverage new technologies and methodologies to enhance BI solutions.
- Additionally, possess exceptional presentation skills, with the ability to effectively communicate complex data-driven insights to both technical and non-technical stakeholders, using clear and concise language and compelling visuals to support the narrative.
- Be willing to work in a hybrid working model.

Mandatory skill sets ('must have' knowledge, skills, and experience):
- Experience with BI tools such as Power BI, Tableau, QlikView, or similar tools is a must.
- Strong experience in data visualization, interpretation, and analysis.
- High proficiency in modern BI tools and advanced MS Office skills.
- Experience working with Agile methodologies.
- Strong communication and problem-solving abilities for effective collaboration and innovative solutions.

Preferred skill sets ('good to have' knowledge, skills, and experience):
- Ability to work in a fast-paced, dynamic environment.
- Ability to work independently and in a team environment.
- Strong attention to detail and ability to multitask.

Years of experience required:
4+ years of experience in developing BI solutions.
Education qualification:
BE, B.Tech, ME, M.Tech, MBA, or MCA (60% or above)

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Power BI
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}

Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 2 days ago
3.0 - 8.0 years
3 - 8 Lacs
hyderabad, bengaluru
Work from Office
We are currently seeking a Qlik Developer to join our team in Bengaluru, Karnataka (IN-KA), India (IN). This position provides input and support for Qlik implementation activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of reports and automations). He/She performs tasks within planned durations and established deadlines, collaborates with teams to ensure effective communication and support the achievement of objectives, and provides knowledge, development, maintenance, and support for applications.

Responsibilities:
- Generates application documentation.
- Contributes to Qlik systems analysis and design.
- Designs and develops Qlik reports.
- Designs and develops Qlik automations.
- Designs and develops Qlik data sources.
- Monitors emerging Qlik technologies and products.
- Provides support for developed Qlik Sense tenants.
- Utilizes Microsoft Azure DevOps and Git for DevOps CI/CD.

Qualifications:
- Bachelor's degree (or international equivalent), or 3+ years of experience, in Computer Science, Information Systems, Mathematics, Statistics, or a related field.
- Experience with Agile SDLC using Microsoft Azure DevOps.
- Experience with Qlik Sense.
- Experience developing Qlik reports.
- Experience troubleshooting Qlik reporting issues.

Location: Bengaluru, Hyderabad, Chennai, Noida, Pune, Gurugram (India)
Posted 2 days ago
QlikView is a popular business intelligence tool that is widely used in India by various organizations to analyze and visualize data. The demand for QlikView professionals in India is on the rise, with plenty of job opportunities available in different industries. If you are a job seeker looking to explore QlikView jobs in India, this guide will provide you with valuable information to help you navigate the job market effectively.
Major IT hubs such as Bengaluru, Hyderabad, Pune, Chennai, and Gurugram are known for their thriving IT industry and have a high demand for QlikView professionals.
The average salary range for QlikView professionals in India varies based on experience level:
- Entry-level: ₹3-5 lakhs per annum
- Mid-level: ₹6-10 lakhs per annum
- Experienced: ₹12-20 lakhs per annum
Typically, a career in QlikView progresses as follows:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
Advancing in this career path often requires gaining experience, acquiring certifications, and staying updated with the latest trends in the field.
In addition to QlikView expertise, professionals in this field are often expected to have knowledge of:
- SQL
- Data visualization
- Data modeling
- ETL tools
- Business intelligence concepts
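Several of these skills, notably SQL, ETL, and data modeling, come together in a typical QlikView load script, where data is extracted with SQL and transformed as it is loaded. Below is a minimal sketch of that pattern; the connection, table, and field names (`dbo.Customers`, `CountryCode`, and so on) are illustrative assumptions, not part of any specific job requirement.

```
// Hypothetical example: SQL extraction plus an ETL-style transformation
// in a QlikView load script. All names here are illustrative.

// Mapping table: CountryCode -> CountryName (kept out of the data model)
CountryMap:
MAPPING LOAD CountryCode,
     CountryName
INLINE [
CountryCode, CountryName
IN, India
US, United States
];

// Extract from a (hypothetical) source database; the preceding LOAD
// transforms each row as it arrives from the SQL SELECT below
Customers:
LOAD CustomerID,
     CustomerName,
     ApplyMap('CountryMap', CountryCode, 'Unknown') AS Country;
SQL SELECT CustomerID, CustomerName, CountryCode
FROM dbo.Customers;
```

Using a MAPPING LOAD with ApplyMap(), rather than a join, keeps the lookup table out of the final data model and avoids accidental row multiplication, which is why interviewers often ask about the difference between the two approaches.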
Here are 25 interview questions that you may encounter in QlikView job interviews:
- What is QlikView and how does it differ from traditional BI tools? (basic)
- Explain the QlikView data model. (medium)
- How do you optimize QlikView applications for better performance? (medium)
- What are set analysis and set modifiers in QlikView? (advanced)
- Describe the difference between direct and indirect table mapping in QlikView. (medium)
- How do you handle synthetic keys in QlikView? (medium)
- What is incremental load in QlikView and why is it important? (medium)
- Explain the concept of QVD files in QlikView. (medium)
- Can you create a circular reference in QlikView? If yes, how do you resolve it? (advanced)
- What is the significance of the 'where' clause in QlikView? (basic)
- How do you implement section access in QlikView? (advanced)
- Describe the importance of data profiling in QlikView. (basic)
- How do you handle null values in QlikView? (medium)
- Explain the concept of alternate states in QlikView. (advanced)
- How do you implement incremental data load in QlikView? (medium)
- What is the difference between mapping load and apply map in QlikView? (medium)
- How do you create a pivot table in QlikView? (basic)
- Describe the difference between concatenate and join in QlikView. (medium)
- What are the different types of joins in QlikView? (medium)
- How do you create a multi-box object in QlikView? (basic)
- Explain the concept of synthetic key and its impact on data modeling in QlikView. (medium)
- How do you create a cross table in QlikView? (basic)
- What are the different types of variables in QlikView and how are they used? (medium)
- How do you implement incremental data extraction in QlikView? (medium)
- Describe the process of data transformation in QlikView. (medium)
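Several of the questions above (incremental load, QVD files, the WHERE clause, and Exists()) revolve around one common pattern: fetch only new or changed rows from the source, merge them with history stored in a QVD, and write the QVD back. Here is a minimal sketch of that insert-and-update pattern; the file path, field names, and the hard-coded timestamp are illustrative assumptions.

```
// Hypothetical incremental load sketch. Connection, path, and field
// names are illustrative; vLastExecTime would normally be derived
// from the existing QVD or a load log rather than hard-coded.

LET vLastExecTime = '2024-01-01 00:00:00';

// 1. Fetch only the incremental slice from the source database
Orders:
SQL SELECT OrderID, CustomerID, Amount, ModifiedDate
FROM dbo.Orders
WHERE ModifiedDate >= '$(vLastExecTime)';

// 2. Append historical rows from the QVD, skipping any OrderID that
//    was already loaded above (so updated rows are not duplicated)
Concatenate (Orders)
LOAD OrderID, CustomerID, Amount, ModifiedDate
FROM [Orders.qvd] (qvd)
WHERE NOT Exists(OrderID);

// 3. Persist the refreshed history for the next run
STORE Orders INTO [Orders.qvd] (qvd);
```

This is the insert-and-update variant; a pure insert-only load could omit the Exists() check, while handling deletions additionally requires an inner join against the current set of source keys. Being able to explain these variants is usually what the incremental-load interview questions are probing for.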
As you prepare for QlikView job interviews, make sure to brush up on your technical skills, practice answering interview questions, and showcase your expertise confidently. With the right preparation and a positive attitude, you can land a rewarding job in the dynamic field of QlikView in India. Good luck!