
175 Knime Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Delhi

Remote

Looking for high-growth-driven individuals to join our fast-growing company.

Position: Data Analyst (US Medical Billing)
Location: Home Based / Remote
Experience: 3+ years (with relevant experience as a data analyst)
Qualification: Bachelor's in IT or Business Intelligence

Requirements:
- Similar experience working as a data analyst in US medical billing
- Strong knowledge of data modeling, manipulation, and transformation concepts (required)
- SQL: 1-2 years (required)
- Advanced Excel: 2-3 years (required)
- KNIME Analytics / Power Query experience is a big plus
- BIRT Reporting: 1-2 years (preferred)
- Experience in AWS
- Any programming language with practical application (Python preferred)

Job Types: Full-time, Contractual / Temporary
Benefits: Internet reimbursement, work from home
Schedule: Night shift
Supplemental Pay: Quarterly bonus
Work Location: Remote

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About us

Bain & Company is a global consultancy that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.

In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as the BCN (Bain Capability Network), with nodes across various geographies. The BCN is the largest and an integral unit of Expert Client Delivery (ECD). ECD plays a critical role by adding value to Bain's case teams globally, supporting them with analytics and research solutioning across all industries and specific domains for corporate cases, client development, private equity diligence and Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.

Who you will work with

The Consumer Products Center of Expertise collaborates with Bain's global Consumer Products Practice leadership, client-facing Bain leadership and teams, and end clients on the development and delivery of Bain's proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain's CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS, etc. You will work as part of a team comprising a mix of a Director, Managers, Project Leads, Associates and Analysts, on projects that typically range from 3 weeks to 6 months.
Delivery models vary from working as part of a broader global Bain case team, to the BCN working independently with a Bain Associate Partner / Partner, to the BCN working directly with end clients.

What you'll do
- Contribute as the manager of a 6-12-member team of Project Leaders, Associates and Analysts to build solutions and perform analyses within the CP domain
- Work with different analytical tools and maintain a continuously developing command of them (Tableau / Power BI, Alteryx / KNIME / Tableau Prep, SQL, Python, R and other tools) on data from relevant data sources
- Ensure timely, high-quality delivery to Bain case leadership and clients through effective team management; define deliverables; prioritize and set deadlines; review work, provide feedback and ensure quality control across 2+ cases in parallel
- Exhibit expertise in scoping, designing and executing consumer products solutions based on client requirements, and convert them into actionable tasks for the teams
- Brainstorm with internal and external stakeholders to understand and resolve complex issues across work streams; generate and screen realistic answers based on sound reality checks and recommend actionable solutions across CP domains (go-to-market strategies, negotiation strategies, pricing/promotional plans, cost optimization, etc.)
- Build, own and maintain key relationships with internal Bain global CP leadership and external client teams by contributing as a thought partner
- Identify and proactively engage on critical issues on projects and with clients; proactively resolve roadblocks and escalate issues as needed
- Deliver projects relating to brand strategy, revenue management, negotiations, pricing/promotions, IP, etc., relevant to the CP industry
- Build expertise in one or more key sub-sectors within CP, covering consumer preferences, trends, and the market and competitor landscape
- Work effectively in a fast-paced environment; adapt to changing client situations and expectations
- Effectively manage client and team meetings; deliver clear and professional presentations to project leadership and client teams
- Brainstorm and suggest new ways of collaborating with the BCN on products, clients, IP, etc.
- Enhance the efficiency of solutions by driving innovations such as automation
- Create professional development plans to provide effective coaching and training to Project Leaders (PLs) and Associates as direct reports
- Provide day-to-day coaching on work planning, problem solving, hypothesis generation and research
- Constructively engage in the mutual feedback process with your supervisor and direct reports; recognize accomplishments and provide concrete, regular and actionable feedback
- Participate in the hiring and supply-building process for the CP CoE, including screening profiles, interviews, induction, etc.
About you
- Candidates should be graduates/postgraduates with strong academic records
- If your highest qualification is undergraduate studies: 8-11 years of relevant experience and exposure to management consulting and data analytics relating to market/business/consumer insights, preferably in a global MNC environment within the Consumer Products / Retail industry domains
- If your highest qualification is postgraduate studies: 6-9 years of the same relevant experience
- Must have professional experience providing internal/external strategic consulting to Consumer Products clients, aimed at developing go-to-market strategies, negotiation strategies, pricing/promotional plans and cost optimization
- Must have a proven track record of managing and maintaining multiple client accounts and teams
- Must be able to analyze quantitative and qualitative data to identify patterns, opportunities and gaps, and integrate across disparate industry data resources (e.g., Nielsen/IRI, Mintel, Kantar, shopper card data, client financials, etc.)
- Must have experience applying analytics to a range of business situations and a proven ability to synthesize complex data into simple, clear insights
- Must have professional experience with analytical tools and techniques such as Alteryx, Tableau and Power BI; an understanding of Python, R and SPSS would be a plus
- Strong academic credentials, analytical ability and leadership skills
- Must have excellent communication skills and be able to drive senior client/stakeholder-level discussions succinctly to favorable outcomes
- Must be able to deal with ambiguity and develop open-ended ideas into practical results
- Must have the maturity to lead by example, with a willingness to get into detail as required while also balancing delegation effectively
- Must be a strong team player with a demonstrated ability to motivate team members
- Good to stay updated on the latest advancements in AI, data analysis and data tools in order to apply best practices

What makes us a great place to work

We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ people and parents.

Posted 1 week ago

Apply

9.0 - 13.0 years

40 - 45 Lacs

Kolkata

Work from Office


Knowledge of advanced Excel and programs such as Power BI (with DAX), SQL, Tableau, KNIME and Python; AI/ML and data modelling/ETL; data preparation and deriving insights. Background in business, finance and merchandise planning. Presentation skills and an aptitude for descriptive analysis.

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description

Responsibilities:

Technical Mentorship and Code Quality: Mentor team members on coding standards, optimization, and debugging while conducting code and report reviews to enforce high code quality. Provide constructive feedback and enforce quality standards.

Testing and Quality Assurance Leadership: Lead the development and implementation of rigorous testing protocols to ensure project reliability and advocate for automated test coverage.

Process Improvement and Documentation: Establish and refine standards for version control, documentation, and task tracking to improve productivity and data quality. Continuously refine these processes to enhance team productivity, streamline workflows, and ensure data quality.

Hands-On Technical Support: Provide expert troubleshooting support in Python, MySQL, GitKraken, Tableau and Knime, helping the team resolve complex technical issues. Provide on-demand support to team members, helping them overcome technical challenges and improve their problem-solving skills.

High-Level Technical Mentorship: Provide mentorship in advanced technical areas, including best practices, data visualization and advanced Python programming. Guide the team in building scalable and reliable solutions to continually track and monitor data quality.

Cross-Functional Collaboration: Partner with data scientists, product managers, and data engineers to align data requirements, testing protocols, and process improvements. Foster open communication across teams to ensure seamless integration and delivery of data solutions.

Continuous Learning and Improvement: Stay updated with emerging data engineering methodologies and best practices, sharing relevant insights with the team. Drive a culture of continuous improvement, ensuring the team's skills and processes evolve with industry standards.
Data Pipelines: Design, implement and maintain scalable data pipelines for efficient data transfer, transformation, and visualization in production environments.

Qualifications

Skills and Experience:

Educational Background: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Equivalent experience in data engineering roles will also be considered.

Data Integrity & Validation Experience: Strong ability to assess, validate, and ensure the integrity of large datasets, with experience in identifying data inconsistencies, anomalies, and patterns that indicate data quality issues. Proficient in designing and implementing data validation frameworks.

Analytical & Problem-Solving Mindset: Critical thinking with a habit of asking "why": why anomalies exist, why trends deviate, and what underlying factors are at play. Strong diagnostic skills to identify root causes of data issues and propose actionable solutions. Ability to work with ambiguous data and derive meaningful insights.

Attention to Detail: Meticulous attention to data nuances, capable of spotting subtle discrepancies. Strong focus on data accuracy, completeness, and consistency across systems.

Technical Proficiency:
- Programming: Expert-level skills in Python, with a strong understanding of code optimization, debugging, and testing.
- Object-Oriented Programming (OOP) Expertise: Strong knowledge of OOP principles in Python, with the ability to understand modular, reusable, and efficient code structures. Experience in implementing OOP best practices to enhance code organization and maintainability.
- Data Management: Proficient in MySQL and database design, with experience in creating efficient data pipelines and workflows.
- Tools: Advanced knowledge of Tableau. Familiarity with Knime or similar data processing tools is a plus.

Testing and QA Expertise: Proven experience in designing and implementing testing protocols, including unit, integration, and performance testing.
Process-Driven Mindset: Strong experience with process improvement and documentation, particularly for coding standards, task tracking, and data management protocols.

Leadership and Mentorship: Demonstrated ability to mentor and support junior and mid-level engineers, with a focus on fostering technical growth and improving team cohesion. Experience leading code reviews and guiding team members in problem-solving and troubleshooting.

Problem-Solving Skills: Ability to handle complex technical issues and serve as a key resource for team troubleshooting. Expertise in guiding others through debugging and technical problem-solving.

Strong Communication Skills: Ability to clearly articulate technical challenges, propose effective solutions, and align cross-functional teams on project requirements, technical standards, and data workflows. Strong at conveying complex ideas to both technical and non-technical stakeholders, ensuring transparency and collaboration. Skilled in documenting data issues, methodologies, and technical workflows for knowledge sharing.

Adaptability and Continuous Learning: Stay updated on data engineering trends and foster a culture of continuous learning and process evolution within the team.

Data Pipelines: Hands-on experience in building, maintaining, and optimizing ETL/ELT pipelines, including data transfer, transformation, and visualization, for real-world applications. Strong understanding of data workflows and the ability to troubleshoot pipeline issues quickly, with the ability to automate repetitive data processes to improve efficiency and reliability.

Additional Information

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves
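The "data validation framework" skill this posting asks for can be pictured with a minimal sketch. The check names, field names ("id", "amount") and sample rows below are hypothetical illustrations, not taken from the posting:

```python
# Minimal sketch of a data-validation pass over row dicts.
# Field names and sample data are illustrative assumptions only.
from collections import Counter

def validate(rows):
    """Run simple completeness/consistency/accuracy checks and report counts."""
    issues = {"missing": 0, "duplicate_ids": 0, "negative_amounts": 0}
    # Consistency: count rows whose primary key repeats
    ids = Counter(row["id"] for row in rows)
    issues["duplicate_ids"] = sum(c - 1 for c in ids.values() if c > 1)
    for row in rows:
        if row["amount"] is None:
            issues["missing"] += 1           # completeness check
        elif row["amount"] < 0:
            issues["negative_amounts"] += 1  # accuracy / range check
    return issues

sample = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},
    {"id": 2, "amount": 7.5},   # duplicate primary key
    {"id": 4, "amount": None},  # missing value
]
report = validate(sample)
```

A production framework would externalize the rules and report row-level detail, but the shape (rules applied per row, counts aggregated into a report) is the same.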

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Responsibilities

Job Description:
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, from small-scale databases to databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines, and assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources such as StackOverflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which can streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration / continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and an understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and the ability to work independently.

Qualifications: 5+ years of experience in database engineering.

Additional Information

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves
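As a rough illustration of the "import workflows and scripts" duty this posting describes, here is a hedged sketch that loads CSV rows into a relational table. sqlite3 stands in for the PostgreSQL/MySQL systems the posting names, and the table and column names are invented for the example:

```python
# Sketch of an automated import workflow: parse CSV rows and load them
# into a relational table with parameterized, idempotent inserts.
# sqlite3 is a stand-in for PostgreSQL/MySQL; names are hypothetical.
import csv
import io
import sqlite3

# In a real workflow this would be a spreadsheet export on disk
csv_data = io.StringIO("id,name,score\n1,alice,90\n2,bob,85\n")

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE results (id INTEGER PRIMARY KEY, name TEXT, score INTEGER)"
)

# Cast fields explicitly so bad rows fail loudly instead of silently
rows = [(int(r["id"]), r["name"], int(r["score"]))
        for r in csv.DictReader(csv_data)]

# INSERT OR REPLACE keeps re-runs of the import consistent (upsert-like)
conn.executemany(
    "INSERT OR REPLACE INTO results (id, name, score) VALUES (?, ?, ?)", rows
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
```

Parameterized `executemany` plus explicit casts is the core of most such scripts; swapping the connection for a PostgreSQL or BigQuery client changes the driver, not the shape.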

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote


Role: Cloud Engineer - Operations
Location: India

About the Operations Team

The team's remit includes the activities, processes and practices involved in managing and maintaining the operational aspects of an organization's IT infrastructure and systems. It focuses on ensuring the smooth and reliable operation of IT services, infrastructure components and supporting systems in the Data & Analytics area.

Duties:
- Provide expert service support as the L3 specialist for the service.
- Identify, analyze, and develop solutions for complex incidents or problems raised by stakeholders and clients as needed.
- Analyze issues and develop tools and/or solutions that enable business continuity and mitigate business impact.
- Proactively and promptly update assigned tasks, and provide responses and solutions within the team's agreed timelines.
- Propose corrective action plans for problems.
- Deploy bug fixes in managed applications.
- Gather requirements, then analyze, design and implement complex visualization solutions.
- Participate in internal knowledge sharing, collaboration activities, and service improvement initiatives.
- Tasks may include monitoring, incident/problem resolution, documentation, automation, assessment and implementation/deployment of change requests.
- Provide technical feedback and mentoring to teammates.

Requirements:
- Willing to work either the ASIA, EMEA, or NALA shift.
- Strong analytical thinking and problem-solving skills.
- Strong communication skills: the ability to translate technical details for business/non-technical stakeholders.
- Extensive experience with the R language and with SQL, T-SQL and PL/SQL, including but not limited to ETL, merge, partition exchange, exception and error handling, and performance tuning.
- Experience with Python/PySpark, mainly Pandas, NumPy, Pathlib and PySpark SQL functions.
- Experience with Azure fundamentals, particularly Azure Blob Storage (file systems and AzCopy).
- Experience with Azure Data Services: Databricks and Data Factory.
- Understands the operation of ETL processes, triggers and schedulers; logging, dbutils, PySpark SQL functions, and handling different file types.
- Experience with Git repository maintenance and DevOps concepts.
- Familiarity with build, test, and deployment processes.

Nice to have:
- Experience with Control-M (if no experience, required to learn on the job)
- KNIME
- Power BI
- Willingness to be cross-trained in all of the technologies involved in the solution.

We offer:
- Stable employment. On the market since 2008, with 1300+ talents currently on board in 7 global sites.
- An "office as an option" model: you can choose to work remotely or in the office.
- Flexibility regarding working hours and your preferred form of contract.
- A comprehensive online onboarding program with a "Buddy" from day 1.
- Cooperation with top-tier engineers and experts.
- Unlimited access to the Udemy learning platform from day 1.
- Certificate training programs. Lingarians earn 500+ technology certificates yearly.
- Upskilling support: capability development programs, Competency Centers, knowledge sharing sessions, community webinars, and 110+ training opportunities yearly.
- Grow as we grow as a company. 76% of our managers are internal promotions.
- A diverse, inclusive, and values-driven community.
- Autonomy to choose the way you work. We trust your ideas.
- Create our community together. Refer your friends to receive bonuses.
- Activities to support your well-being and health.
- Plenty of opportunities to donate to charities and support the environment.

If you believe you are a good fit for the position, please create an application for us via the link given below.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


The Regulatory Reporting role is a senior analyst position within Global Regulatory Operations (RegOps), part of the Services Markets and Banking Operations group. The individual will support the preparation, analysis and submission of regulatory reports to various constituencies (the South Korea Financial Supervisory Service (FSS) and the Korea Exchange (KRX), among others), based on applicable local regulatory reporting rules and instructions. The position has a high level of visibility within the organization, with opportunities to work directly with other functions within and outside Operations throughout Citi, including Compliance, Business, Legal, Finance and Technology.

Responsibilities:

The candidate will ensure timely and accurate completion of the primary responsibilities detailed below to support RegOps. The role also involves continuously improving processes and enhancing functionality, while delivering excellent service to major stakeholders and colleagues. The primary responsibilities for the role are as follows:
- Ensure completeness, accuracy and timely preparation of the monthly and daily reports submitted to multiple regulators across the APAC region.
- Establish controls for monitoring regulatory reports, such as reconciliations, validation checks and variance analysis.
- Serve as a subject matter expert and central point of contact for regulatory advice, and interact with Business, Legal and Compliance as needed to provide responses to regulatory inquiries.
- Mitigate, manage, escalate, track, and ensure resolution of issues that pose operational regulatory risk to the firm.
- Develop and maintain effective partnerships with all stakeholders on regulatory-related issues.
- Implement changes to the reporting process based on new or changing regulations, in conjunction with Business, Operations, Technology, Compliance and Legal.
- Responsible for front-to-back testing of business and regulatory change, ensuring results are documented and presented for sign-off by the supervisor.
- Automation and change management for on-boarding new reports into the Regulatory Operations group.
- Assist the team during regulatory inquiries and examinations, delivering accurate information and analysis to aid regulatory reviews.
- Able to operate with a limited level of direct supervision, exercising independence of judgement and autonomy.

Qualifications:
- 6+ years of financial industry work experience in Operations, Middle Office, or Projects, with an in-depth understanding of financial products. Operations regulatory reporting experience will be considered favorably.
- Strong technical problem-solving skills and an ability to identify conflicts, discrepancies and other issues.
- Strong work ethic; a highly motivated individual with an outstanding record of professional achievement.
- Attention to detail; proactive in identifying, escalating and resolving potential issues.
- Strong control and Compliance focus. Candidates who have worked in regulatory interface functions have an added advantage.
- Material experience with operational risk disciplines: processes, risks, and controls.
- Strong leadership, interpersonal and project management skills.
- A demonstrated ability to thrive in meeting regulatory and Compliance requirements is an added advantage.
- An understanding of the trade lifecycle would be a plus.
- Good verbal and written communication skills in English; Chinese would be an advantage.
- Utilizes digitalization tools, including Power BI, Tableau, KNIME and Python, to automate processes and create controls for data quality and flow, and Excel macros to analyze large amounts of data across reports.
Education: Bachelor's/University degree in Finance, Accounting or a related field
------------------------------------------------------
Job Family Group: Finance
------------------------------------------------------
Job Family: Regulatory Reporting
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office


The Data Manager will:
- Have a strong command of data wrangling and data storytelling.
- Address and solve data quality issues, either through automated updates or by manually updating information in KYC source systems, or by uplifting the data into designated platforms.
- Source data from a variety of sources, such as core banking systems, to combine, synthesise and analyse it to improve data quality.
- Collaborate with stakeholders and SMEs to ensure that in-scope issues are accurately rectified and meet business requirements.
- Have good knowledge of, and experience with, tools for data analysis and data remediation (e.g. Excel, KYC platforms, and reporting tools such as Power BI and KNIME).

Continuous Improvement & Change: Understands, accepts and supports the need for change, adapts their own behaviours to changing circumstances and provides input to change projects.
Problem Solving: Comprehensive understanding of a range of problem-solving techniques.
Understanding depth & breadth of data: Some capability to source, join and derive insights from disparate data sources across several domains.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Description

Digital Innovation Engineer - Chemical Engineer
Location: Chennai, India
Required Language: English
Employment Type: Permanent
Seniority Level: Associate
Travel: <10%

Buckman is a privately held, global specialty chemical company with headquarters in Memphis, TN, USA, committed to safeguarding the environment, maintaining safety in the workplace, and promoting sustainable development. Buckman works proactively and collaboratively with its worldwide customers in pulp and paper, leather, and water treatment to deliver exceptional service and innovative specialty chemical solutions that help boost productivity, reduce risk, improve product quality, and provide a measurable return on investment. Buckman is in the middle of a digital transformation of its businesses and is focused on building the capabilities and tools to support it.

Job Description

We are looking for degreed chemical engineers to staff our remote monitoring team at our Chennai offices. You will be creating and using advanced tools to monitor our customer facilities around the world in the Paper, Water, Leather and Performance Chemicals sectors. The responsibilities of the remote monitoring team include detecting, diagnosing and responding in real time to system anomalies, optimizing system performance remotely, assisting with installation and commissioning, handling device management, and assisting with user acceptance testing. You will work with key stakeholders on the sales and service front to ensure we manage systems consistently and efficiently. Candidates will staff night shifts on a rotating basis. If you like working for an entrepreneurial company with a sustainability mission and digital ambitions at the core of its strategy, Buckman is the place for you.
Basic Qualifications
- Bachelor's degree in Chemical Engineering, Environmental Engineering, Chemistry or a related field from a reputed university
- 5 years of work experience in the chemical engineering field
- Strong communication skills in English and the ability to work effectively with global stakeholders
- An aptitude for learning digital tools and technologies

Preferred Qualifications
- 4+ years of experience in the chemical engineering field
- Hands-on work experience with open-source data analytics toolsets such as R, Python, MATLAB, etc.
- Background and expertise in one of the following areas: advanced use of tools like Excel; exposure to data analytics and visualization platforms such as SAS, RapidMiner, Power BI, KNIME, etc.; software; Internet of Things applications

Preferred Personality Traits
- A strong business focus, ownership and inner self-drive to solve real-world, impactful problems with innovative digital solutions
- A pleasant personality; collaborates easily with others
- Aims high and brings a sense of urgency and ownership to all tasks
- A life-long learner who constantly updates their skills

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


We are Lenovo. We do what we say. We own what we do. We WOW our customers. Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). To find out more visit www.lenovo.com, and read about the latest news via our StoryHub. DATA ANALYST Lenovo is not just another technology company. We make the technology that powers the world’s best ideas. We design tools for those who are driven by accomplishment. We are the company that powers the people who Do. The engine that helps them Do more. Do better. Do what’s never been done. And we are united in the quest to help our users defy the impossible. We are looking for driven individuals who thrive in a truly global company and culture; come put your career on the fast track with the fastest-growing PC company in the world! Lenovo… For Those Who Do. Lenovo is seeking a Data Analyst to support the Internal Audit function. Primary Responsibilities This position will play a pivotal role in building and growing the data analytics capabilities in Internal Audit at Lenovo.
The primary role of the Data Analyst is to identify, design and develop data analytics routines to support audit activities performed by the Internal Audit team. Maintain an effective system of data analytics and models which provides enhanced insight into risks and controls, establishes an efficient/automated means to analyze and test large volumes of data for outliers, anomalies, patterns, and trends, and helps evaluate the adequacy and effectiveness of controls. Create repeatable data analytics to support continuous audit monitoring programs. Identify, design, and develop data analytics extract routines to support all phases of audit activities performed by the Internal Audit team, including risk assessment, planning and scoping, testing, and reporting. Manage data extraction, storage, transformation, and processing through data analytics routines, and generate output for visualization/analysis by the Internal Audit team. Identify and develop innovative and re-usable analytics solutions to drive auditor self-service. Use data analysis tools to automate audit testing and develop techniques for continuous auditing and analyzing large volumes of data. Interact with management and business partners to identify appropriate data sources and data elements required for analytics, applying professional skepticism when assessing data sources and validating the completeness and accuracy of data received. Transfer appropriate knowledge to the Data Champions and wider Internal Audit team in relation to data, systems, and data analytics queries. Interact and collaborate with Internal Audit team members in working towards departmental goals. Execute special projects and other assignments as requested by management. Project Management Be exceptionally skilled at identifying, prioritizing, and articulating issues, and at applying effective methods of audit testing and data analysis. Work independently on all aspects of audit engagements.
Prepare complete and meticulous audit work products consistently, including preparation of data sheets and work papers within standard time frames without reminders. Proactively review work products of others in your area of responsibility, provide highly constructive critical feedback, and take appropriate corrective actions. Maintain excellent relationships with clients and act in a professional manner. Independently develop and implement solutions to problems. Consistently seek opportunities to support objectives through individual and team efforts. Communication & Collaboration Regularly deal with managers/senior managers/executives within the organization. Communicate confidently in a clear, concise and articulate manner, providing relevant facts to the client, team leader, and management. Alert management when concerns arise. Offer recommendations or solutions to problems. Business Acumen Be accountable for contributing to projects involving multi-functional teams. May regularly participate in overall functional projects. Make leadership and review-related decisions promptly and effectively. Understand the business and complexities of the environment, appreciate the struggles of balancing resources and company priorities, and propose solutions based on cost/benefit analysis. Actively seek opportunities to add value, provide insight and facilitate business collaborations. Contribution / Leadership Independently identify, plan, and execute projects. Assist team leaders to complete phases of an audit from planning and risk assessment to testing and communication of results to achieve audit objectives. Anticipate problems or changes in plans and reviews, and recommend appropriate solutions. Consistently take personal responsibility for the success of mentored team members and others. Position Requirements Bachelor’s degree in business, mathematics, computer science, or management information systems.
3-5 years of relevant/recent data analysis experience in audit, financial, risk management, or technology functions. Strong quantitative, analytical, data-intuition, and problem-solving skills, and proficiency in data analytics techniques. Strong knowledge and proficiency in extraction and analysis of data from a wide variety of data sources, data analysis tools (Power BI preferred), process mining, Alteryx/KNIME, SQL, visualization tools, R, Python, etc. Preferred knowledge and experience in the manufacturing and high-tech industries. Experience building and supporting continuous audit monitoring programs. Working knowledge of internal controls and auditing techniques. Strong communication and relationship-building skills. Must be open to travel (up to 20%) Optional Certifications: CIA, CISA We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, national origin, status as a veteran, disability, or any other federal, state, or local protected class.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Delhi

On-site

ABOUT US: Bain & Company is a global consultancy that helps the world’s most ambitious change-makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. Expert Client Delivery (ECD) is an integral unit of BCN. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutions across all industries, specific domains for corporate cases, client development, private equity diligence, and Bain intellectual property. WHO YOU’LL WORK WITH: The Retail Center of Expertise collaborates with Bain’s global Retail Practice leadership, client-facing Bain leadership and teams, and with end clients on the development and delivery of Bain’s proprietary Retail products and solutions. These solutions aim to answer strategic questions of Bain’s Retail clients relating to category management, COGS optimization, pricing & promotion analytics, space optimization, and customer analytics. As part of the Retail Center of Expertise, you will work in teams comprising a mix of Directors, Managers, Project Leads, Associates, and Analysts, on projects ranging from 3 weeks to 6 months. Delivery models on projects vary from working as part of a broader global Bain case team, working independently with a Bain Associate Partner / Partner, or working directly with end clients.
WHAT YOU’LL DO: Contribute as a member of the team to build / perform solutions within the Retail domain (Grocery, Apparel, General Merchandise, e-commerce, B2B retail, etc.) Own complex workstreams with support from supervisors (Project Leader / Managers / Senior Managers) Interpret, understand and break down client requirements into actionable task items for the team Execute and deliver outputs to clients by collaborating with peers and overseeing a workstream with 1-2 analysts Work with different analytical tools with focus on building expertise in Tableau / Power BI, Alteryx / KNIME, SQL, Python / R, Excel and PowerPoint Ensure timely, high quality, error-free analysis by performing sound data and reality checks Generate insights, hypotheses and come up with solutions for the pertinent issues Lead meaningful and focused meetings and effectively communicate data, knowledge, insights & implications with clients and Bain stakeholders Work well in a team setting, understand the key aspects of delivery and deliver them with direction from senior team members Seek and provide actionable feedback in interactions ABOUT YOU: Candidates should be graduates / post-graduates with a strong academic record Work experience range in case highest qualification is undergraduate studies – 2-4 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP/R industry domains Work experience range in case highest qualification is postgraduate studies – 1-2 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP/R industry domains Must have strong communications skills, should be able to drive discussion / presentations with senior stakeholders and client maps Must have knowledge of ETL / visualization tools such as Alteryx / KNIME / SQL / Python and Tableau / Power BI Must be proficient with Excel and PowerPoint Must have experience in applying advanced analytics to a
range of business situations and a proven ability to synthesize complex data to generate simple & clear insights Good to have knowledge of R, experience with MS Azure / Amazon Web Services Good to have statistical modelling experience Good to have knowledge of key data sources and metrics pertaining to the Retail industry with experience in one or more sub-sectors within Retail WHAT MAKES US A GREAT PLACE TO WORK: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Pune

Work from Office


Must have skills: ML algorithms, NLP and Predictive Analytics, Conversational Analytics, Python, Cloud Knowledge (Azure/AWS/GCP), Proficiency in BI tools (Power BI/Tableau/Qlik Sense/ThoughtSpot/Figma etc.). Leverages artificial intelligence and machine learning techniques to enhance BI processes, along with consulting experience in AI-driven BI transformations (defining BI governance, operating model, assessment framework, etc.) Good to have skills: Knowledge of Gen AI model implementation (OpenAI, Mistral, Phi-2, Solar, Claude 3, etc.), agentic architecture implementation WHAT'S IN IT FOR YOU You'll be part of a diverse, vibrant BI Strategy team: a dynamic and innovative group of Data and BI Strategy & Consulting professionals. We understand the vision and mission of our customers and help them by developing an innovative BI strategy with value-driven use cases, a roadmap, and technological solutions that drive business growth and innovation. As a key pillar of our organization, the Engineering Products team works across various fields from a BI, Data & AI perspective: BI strategy, AI-enabled BI strategy, data modelling, data architecture, industry & AI value strategy, etc., helping our customers set up a strong data platform foundation and a target roadmap to scale and evolve towards their AI/Gen AI and advanced analytics vision and meet the evolving future needs of technology advancement. What you would do in this role Participate in visioning workshops and develop innovative BI strategy and architecture frameworks tailored to the unique goals and aspirations of our clients, ensuring alignment with their evolving needs and preferences. Conduct comprehensive evaluations of clients' existing processes, technologies, and data ecosystems, uncovering opportunities for AI integration that resonate with Gen AI values and lifestyles.
Align with industry and function teams, understand business problems, and translate them into BI, BI governance, AI-enabled BI and operating model requirements. Develop a target strategy with respect to business intelligence, tools and technology, the BI operating model, BI governance, and AI-enabled BI needs, along with key initiatives and a roadmap. Propose the best-suited LLMs (Large Language Models: GPT-3.5, GPT-4, Llama, etc.) per the selection criteria framework and serve as a trusted advisor to our clients, offering expert guidance on AI-enabled BI. Propose adoption strategies that resonate with BI and plan a persona-based BI strategy for enterprise functions. Work closely with cross-functional teams to co-create and implement innovative solutions. Define personas and persona journeys from functional and technical aspects with respect to business intelligence. Architect, design, and deliver AI-enabled BI solutions that address the complex challenges faced by customers and businesses. Participate in client engagements with confidence and finesse, overseeing project scope, timelines, and deliverables to ensure seamless execution and maximum impact. Facilitate engaging workshops and training sessions to empower BI client stakeholders with the knowledge and skills needed to embrace AI-enabled, data-driven transformation. Stay abreast of emerging BI, data and AI analytics trends and technologies, continuously improving internal capabilities and offerings. Participate in BI, AI-enabled BI and analytics-aligned solutions and client demos. Who are we looking for Years of Experience: Candidates should typically have at least 6-9 years of experience in BI strategy, management, or a related field. This experience should include hands-on involvement in developing and implementing BI strategies for clients across various industries. Education: A bachelor's or master's degree in computer science, Data Science, Engineering, Statistics, or a related field.
Additional eligibility: Strong analytical and strategic thinking skills are essential for this role. Candidates should be able to assess clients' business processes, technologies, BI, AI and data infrastructure to find opportunities for BI and AI-enabled BI integration and develop tailored BI strategy frameworks aligned with clients' goals. Demonstrated expertise in artificial intelligence technologies, including machine learning, natural language processing, computer vision, and deep learning. Candidates should have a solid understanding of AI concepts, algorithms, and their applications in solving real-world business problems. Excellent communication and presentation skills are crucial for effectively articulating complex technical concepts to non-technical stakeholders. Candidates should be able to convey AI strategy recommendations in a clear, concise, and compelling manner. Knowledge of cloud computing services (AWS, Azure, GCP) related to scaled AI and data analytics solutions. The ability to collaborate effectively with cross-functional teams is essential for this role. Candidates should be comfortable working with diverse teams to design and execute AI solutions that address clients' business challenges and deliver measurable results. Candidates should have a strong understanding of Gen AI preferences, behaviors, and values, along with an understanding and working knowledge of various large language models in order to propose and implement the best-suited LLMs for our customers based on AI strategies that resonate with Gen AI. Working experience with machine learning algorithms and statistical modeling techniques. Knowledge of MLOps tools and services from a strategic mindset will be a plus. Desired experience: Minimum 6 years of experience working with clients in the products industries (Life Sciences, CPG, Industrial & Retail) that are heavily influenced by AI and Gen AI preferences and behaviors is highly valued.
Candidates who have a deep understanding of AI and Gen AI trends and market dynamics can provide valuable insights and strategic guidance to clients. Minimum 5 years of proven experience and deep expertise in developing and implementing AI strategy frameworks tailored to the specific needs and aims of clients within the LS, Industrial, CPG, and Retail sectors. The ability to craft innovative AI solutions that address industry-specific challenges and drive tangible business outcomes will set you apart. Minimum 6 years of strong consulting background with a demonstrated ability to lead client engagements from start to completion. Consulting experience should encompass stakeholder management and effective communication to ensure successful project delivery. Minimum 4 years of strong working experience with the AI implementation lifecycle and working knowledge of the AI and analytics lifecycle, including problem definition, data preparation, model building, validation, deployment, and monitoring. Tools & Techniques Knowledge of UI/UX design using React, Angular and visualization tools (Power BI/Tableau/Qlik Sense) Knowledge of cloud and native services for AI and Gen AI implementation (AWS/Azure/GCP) Hands-on experience with programming languages such as Python, R, SQL or Scala. Working knowledge of AI and analytics platforms and tools, such as TensorFlow, PyTorch, KNIME, or similar technologies Working knowledge of NLP (Natural Language Processing) techniques: text summarization, text classification, Named Entity Recognition, etc. Working knowledge of various machine learning models: supervised, unsupervised, classification, regression, reinforcement learning, neural networks, etc. Knowledge of LLMs: OpenAI GPT, LlamaIndex, etc. Knowledge of MLOps deployment, process tools and strategy (MLflow, AWS SageMaker, etc.) Qualification Experience: Minimum 4 year(s) of experience is required Educational Qualification: B.Tech/BE

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Company Description: Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description: The Staff ML Scientist will work with a team to conduct world-class Applied AI research on data analytics and contribute to the long-term research agenda in large-scale data analytics and machine learning, as well as deliver innovative technologies and insights to Visa's strategic products and business. This role represents an exciting opportunity to make key contributions to Visa's strategic vision as a world-leading data-driven company. The successful candidate must have a strong academic track record and demonstrate excellent software engineering skills. The successful candidate will be a self-starter comfortable with ambiguity, with strong attention to detail, and excellent collaboration skills. Essential Functions: Formulate business problems as technical data problems while ensuring key business drivers are collected in collaboration with product stakeholders. Work with product engineering to ensure implementability of solutions. Deliver prototypes and production code based on need. Experiment with in-house and third-party data sets to test hypotheses on relevance and value of data to business problems. Build needed data transformations on structured and unstructured data. Build and experiment with modeling and scoring algorithms.
This includes development of custom algorithms as well as use of packaged tools based on machine learning, analytics, and statistical techniques. Devise and implement methods for adaptive learning with controls on efficiency, methods for explaining model decisions where vital, model validation, and A/B testing of models. Devise and implement methods for efficiently monitoring model efficiency and performance in production. Devise and implement methods for automation of all parts of the predictive pipeline to minimize labor in development and production. Contribute to development and adoption of shared predictive analytics infrastructure. This is a hybrid position. Hybrid employees can alternate time between both remote and office. Employees in hybrid roles are expected to work from the office 2-3 set days a week (determined by leadership/site), with a general guidepost of being in the office 50% or more of the time based on business needs. Basic Qualifications: • 7+ years of relevant work experience with a Bachelor’s Degree, or at least 2 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD), or 0 years of work experience with a PhD, OR 8+ years of relevant work experience. Preferred Qualifications: • 7 or more years of work experience with a Bachelor’s Degree, or 4 or more years of relevant experience with an Advanced Degree (e.g. Masters, MBA, JD, MD), or up to 3 years of relevant experience with a PhD • Relevant coursework in modeling techniques such as logistic regression, Naïve Bayes, SVM, decision trees, or neural networks. • Ability to program in one or more scripting languages such as Perl or Python and one or more programming languages such as Java, C++, or C#. • Experience with one or more common statistical tools such as SAS, R, KNIME, MATLAB. • Deep learning experience with TensorFlow is a plus. • Experience with Natural Language Processing is a plus.
• Experience working with large datasets using tools like Hadoop, MapReduce, Pig, or Hive is a must. • Publications or presentations in recognized Machine Learning and Data Mining journals/conferences are a plus.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


ABOUT US: Bain & Company is a global consultancy that helps the world’s most ambitious change-makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. Expert Client Delivery (ECD) is an integral unit of BCN. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutions across all industries, specific domains for corporate cases, client development, private equity diligence, and Bain intellectual property. WHO YOU’LL WORK WITH: The Retail Center of Expertise collaborates with Bain’s global Retail Practice leadership, client-facing Bain leadership and teams, and with end clients on the development and delivery of Bain’s proprietary Retail products and solutions. These solutions aim to answer strategic questions of Bain’s Retail clients relating to category management, COGS optimization, pricing & promotion analytics, space optimization, and customer analytics. As part of the Retail Center of Expertise, you will work in teams comprising a mix of Directors, Managers, Project Leads, Associates, and Analysts, on projects ranging from 3 weeks to 6 months. Delivery models on projects vary from working as part of a broader global Bain case team, working independently with a Bain Associate Partner / Partner, or working directly with end clients.
WHAT YOU’LL DO: Contribute as a member of the team to build / perform solutions within the Retail domain (Grocery, Apparel, General Merchandise, e-commerce, B2B retail, etc.) Own complex workstreams with support from supervisors (Project Leader / Managers / Senior Managers) Interpret, understand and break down client requirements into actionable task items for the team Execute and deliver outputs to clients by collaborating with peers and overseeing a workstream with 1-2 analysts Work with different analytical tools with focus on building expertise in Tableau / Power BI, Alteryx / KNIME, SQL, Python / R, Excel and PowerPoint Ensure timely, high quality, error-free analysis by performing sound data and reality checks Generate insights, hypotheses and come up with solutions for the pertinent issues Lead meaningful and focused meetings and effectively communicate data, knowledge, insights & implications with clients and Bain stakeholders Work well in a team setting, understand the key aspects of delivery and deliver them with direction from senior team members Seek and provide actionable feedback in interactions ABOUT YOU: Candidates should be graduates / post-graduates with a strong academic record Work experience range in case highest qualification is undergraduate studies – 2-4 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP/R industry domains Work experience range in case highest qualification is postgraduate studies – 1-2 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP/R industry domains Must have strong communications skills, should be able to drive discussion / presentations with senior stakeholders and client maps Must have knowledge of ETL / visualization tools such as Alteryx / KNIME / SQL / Python and Tableau / Power BI Must be proficient with Excel and PowerPoint Must have experience in applying advanced analytics to a
range of business situations and a proven ability to synthesize complex data to generate simple & clear insights Good to have knowledge of R, experience with MS Azure / Amazon Web Services Good to have statistical modelling experience Good to have knowledge of key data sources and metrics pertaining to the Retail industry with experience in one or more sub-sectors within Retail WHAT MAKES US A GREAT PLACE TO WORK: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Bain & Company is a global consultancy that helps the world’s most ambitious change-makers define the future. Across 61 offices in 39 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network) with its nodes across various geographies. Expert Client Delivery (ECD) is an integral unit of BCN. ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutions across all industries, specific domains for corporate cases, client development, private equity diligence, and Bain intellectual property. WHO YOU’LL WORK WITH: The Retail Center of Expertise collaborates with Bain’s global Retail Practice leadership, client-facing Bain leadership and teams, and with end clients on the development and delivery of Bain’s proprietary Retail products and solutions. These solutions aim to answer strategic questions of Bain’s Retail clients relating to category management, COGS optimization, pricing & promotion analytics, space optimization, and customer analytics. As part of the Retail Center of Expertise, you will work in teams comprising a mix of Directors, Managers, Project Leads, Associates, and Analysts, on projects ranging from 3 weeks to 6 months. Delivery models on projects vary from working as part of a broader global Bain case team, working independently with a Bain Associate Partner / Partner, or working directly with end clients.
WHAT YOU’LL DO: Contribute as a lead of the team to build / perform solutions within the Retail domain (Grocery, Apparel, General Merchandise, e-commerce, B2B retail, etc.) Own individual cases with minimal support from supervisors (Manager / Senior Manager) Interpret, understand and break down client requirements into an actionable work-plan for the team Support case leads in problem solving, hypothesis generation, research and insight generation Ensure timely, high quality, error-free analysis based on sound reality checks and actionable solutions Effectively communicate data, knowledge, insights and implications with clients and Bain stakeholders Lead meaningful and focused meetings, deliver insights effectively to the internal/external client teams Build expertise on existing products and help develop newer solutions for the Retail industry Work with different analytical tools and reinforce continuous understanding of Tableau / Power BI, Alteryx / KNIME, SQL, Python / R and other tools on data from relevant retail data sources Assess and validate relevant data from different sources to leverage based on case objectives Manage the team’s responsibilities, which involve work allocation, work planning, guiding individuals, reviewing work and balancing workload within a team of analysts & associates Provide regular feedback for constant improvement, write reviews and recognize the team’s and individuals’ development needs Assist in recruiting, marketing and training ABOUT YOU: Candidates should be graduates/post-graduates with strong academic records Work experience range in case highest qualification is undergraduate studies – 5-8 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP / R industry domains Work experience range in case highest qualification is postgraduate studies – 3-6 years of relevant experience in a global MNC environment with exposure to management consulting, business analytics, or CP / R industry
domains Must have experience of breaking business objectives into smaller tasks and building project plans Must have strong communication skills, and should be able to drive discussions / presentations with senior stakeholders and client maps Must have experience of managing internal and external stakeholders Must have led 3-4 member teams and demonstrated ability to motivate and mentor team members Must have experience in applying advanced analytics to a range of business situations and a proven ability to synthesize complex data to generate simple and clear insights Must have knowledge of ETL / visualization tools such as Alteryx / KNIME / SQL and Tableau / Power BI Must be proficient with Excel and PowerPoint Good to have knowledge of Python / R Good to have statistical modelling experience Good to have knowledge of key data sources and metrics pertaining to the Retail industry, with experience in one or more sub-sectors within Retail WHAT MAKES US A GREAT PLACE TO WORK: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
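By way of illustration only — a minimal, hypothetical sketch (not part of the role description) of the kind of pricing & promotion analysis this listing describes, using plain Python in place of Alteryx/KNIME workflows; the SKUs and figures are made up:

```python
from collections import defaultdict

# Hypothetical weekly sales records: (sku, on_promo, units, price)
sales = [
    ("SKU1", False, 100, 2.50), ("SKU1", True, 180, 2.00),
    ("SKU1", False, 110, 2.50), ("SKU2", False, 50, 4.00),
    ("SKU2", True, 65, 3.50),
]

def promo_uplift(rows):
    """Percentage uplift of average promo units over average base units, per SKU."""
    base, promo = defaultdict(list), defaultdict(list)
    for sku, on_promo, units, _price in rows:
        (promo if on_promo else base)[sku].append(units)
    result = {}
    for sku in base:
        if sku in promo:
            b = sum(base[sku]) / len(base[sku])
            p = sum(promo[sku]) / len(promo[sku])
            result[sku] = round(100 * (p - b) / b, 1)
    return result

print(promo_uplift(sales))  # → {'SKU1': 71.4, 'SKU2': 30.0}
```

In practice the same aggregation would typically be expressed as a SQL GROUP BY or a KNIME/Alteryx workflow over much larger retail datasets.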

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site


Job Description Job Overview: Position Name: Faculty, Wadhwani Center for Government Digital Transformation (WGDT) Work Location: Delhi About Wadhwani Foundation ( www.wfglobal.org ): Mission : Accelerating economic development in emerging economies through high-value job creation Objectives: Enabling the creation of 10M jobs and placement of 25M by 2030 across 20-25 emerging economies Wadhwani Foundation is a not-for-profit with the primary mission of accelerating economic development in emerging economies by driving large-scale job creation through entrepreneurship, innovation and skills development. Founded in 2000 by Silicon Valley entrepreneur, Dr. Romesh Wadhwani, today the Foundation is scaling impact in 25 countries across Asia, Africa, and Latin America through various Initiatives. More details on the various programs at the end of the document. Job Description: Learning Strategy & Subject Matter Expertise Work in conjunction with the WGDT Academy team to decide subject matter and the best methodologies for training the target audiences (central and state government bureaucrats) Create the content on Emerging Technologies such as data science, machine learning, computer vision, natural language processing, Generative AI for a senior audience of government officials with relevant social sector examples and use cases. Help formulate case studies using no/low code tools for senior policymakers. 
Review the learning content as designed by the Curriculum designer to ensure accuracy and depth from the subject matter perspective Research, produce and deliver high-quality learning assets like training decks, facilitator guides, learner guides, assessments, and other supporting content Learning Delivery Demonstrate strong teaching skills for a senior audience in both a classroom and virtual classroom environment and be able to modify teaching styles accordingly Manage multiple teaching projects simultaneously and liaise with the stakeholders to execute course requirements Take full responsibility for assigned cohorts, from classroom setup, to group assignments, to learning interventions, and then on to data collection on usage, assessment, quality, feedback, etc. Be able to collate and illustrate points using the flipped classroom and case study methodology, as per the major requirements of adult learning Identify and address individual learner requirements so that there is “no student left behind”, which includes follow-ups for assignments, assessments, and feedback to and from learners Demonstrate excellent stakeholder relationship management skills Use all modern communication tools like Teams, Zoom, or other learning platforms as might be required You should have experience in both in-person and online training for a senior audience. Requirements You have at least 7 years of experience You have at least 3 years of experience as a trainer in emerging technologies (freelance or full-time) You possess awareness and deep knowledge of the subject area, including the latest analytics-based technologies You can instruct senior-level learners, with a talent for effectively engaging adult students of diverse ages and backgrounds. You are competent in teaching technical subjects to a non-technical audience, using simple language and avoiding excessive jargon. 
Work in governance and policy will be an asset but is not essential Effective verbal communication skills Technical Skills: Expert-level knowledge of one or more of the Emerging Technologies such as data science, machine learning, computer vision, natural language processing, Generative AI and large language models Knowledge of no/low-code tools like Orange/KNIME is helpful (but not essential) Knowledge of Python / R is helpful (but not essential) Ability to handle and engage a heterogeneous participant base with maturity Experience in using and creating content for Virtual Learning platforms and MOOCs Experience in building new case studies, use cases and assessments in emerging technology areas At least a Bachelor's degree Requirements Deep experience in pedagogy and emerging techniques in adult skilling Experience in the use of technology in large-scale skilling efforts Creation of highly engaging, world-class content for global audiences Training of relevant teams on content and content delivery Use of analytical methods to support content development and refinement Conducting market assessments to identify best practices and gaps Maintaining a strong network of content professionals to get real-time perspectives on opportunities and solutions Engagement with various stakeholders to learn about unique needs and considerations of sub-segments

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Madurai, Tamil Nadu, India

Remote


Experience : 8.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: SQL, Python, Tableau, OOPs, KNIME, Data Integrity, QA Experience, ETL tools, Leadership Forbes Advisor is Looking for: Job Description: Data Integrity Manager Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. We bring rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Integrity Team is a brand-new team with the purpose of ensuring all primary, publicly accessible data collected by our researchers is correct and accurate, allowing the insights produced from this data to be reliable. They collaborate with other teams while also operating independently. 
Their responsibilities include monitoring researched data to ensure that errors are identified and caught as soon as possible, developing detection techniques to find and fix issues, setting up new configurations and ensuring they are correct, testing new developments to guarantee data quality is not compromised, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance, playing a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Data Integrity Manager will involve guiding team members through their tasks while looking for the next set of possible problems. They should understand how to automate systems, optimization techniques, and best practices in debugging, testing and looking for issues. They work closely with other team members, offering technical mentorship, as well as advanced Python, SQL and data visualization practices. Responsibilities: Technical Mentorship and Code Quality: Mentor team members on coding standards, optimization, and debugging while conducting code and report reviews to enforce high code quality. Provide constructive feedback and enforce quality standards. Testing and Quality Assurance Leadership: Lead the development and implementation of rigorous testing protocols to ensure project reliability and advocate for automated test coverage. Process Improvement and Documentation: Establish and refine standards for version control, documentation, and task tracking to improve productivity and data quality. Continuously refine these processes to enhance team productivity, streamline workflows, and ensure data quality. Hands-On Technical Support: Provide expert troubleshooting support in Python, MySQL, GitKraken, Tableau and Knime, helping the team resolve complex technical issues. 
Provide on-demand support to team members, helping them overcome technical challenges and improve their problem-solving skills. High-Level Technical Mentorship: Provide mentorship in advanced technical areas, including best practices, data visualization and advanced Python programming. Guide the team in building scalable and reliable solutions to continually track and monitor data quality. Cross-Functional Collaboration: Partner with data scientists, product managers, and data engineers to align data requirements, testing protocols, and process improvements. Foster open communication across teams to ensure seamless integration and delivery of data solutions. Continuous Learning and Improvement: Stay updated with emerging data engineering methodologies and best practices, sharing relevant insights with the team. Drive a culture of continuous improvement, ensuring the team’s skills and processes evolve with industry standards. Data Pipelines: Design, implement and maintain scalable data pipelines for efficient data transfer, transformation, and visualization in production environments. Skills and Experience: Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. Equivalent experience in data engineering roles will also be considered. Data Integrity & Validation Experience: Strong ability to assess, validate, and ensure the integrity of large datasets, with experience in identifying data inconsistencies, anomalies, and patterns that indicate data quality issues. Proficient in designing and implementing data validation frameworks. Analytical & Problem-Solving Mindset: Critical thinking with a habit of asking "why"—why anomalies exist, why trends deviate, and what underlying factors are at play. Strong diagnostic skills to identify root causes of data issues and propose actionable solutions. Ability to work with ambiguous data and derive meaningful insights. 
Attention to Detail: Meticulous attention to data nuances, capable of spotting subtle discrepancies. Strong focus on data accuracy, completeness, and consistency across systems. Technical Proficiency: Programming: Expert-level skills in Python, with a strong understanding of code optimization, debugging, and testing. Object-Oriented Programming (OOP) Expertise: Strong knowledge of OOP principles in Python, with the ability to understand modular, reusable, and efficient code structures. Experience in implementing OOP best practices to enhance code organization and maintainability. Data Management: Proficient in MySQL and database design, with experience in creating efficient data pipelines and workflows. Tools: Advanced knowledge of Tableau. Familiarity with Knime or similar data processing tools is a plus. Testing and QA Expertise: Proven experience in designing and implementing testing protocols, including unit, integration, and performance testing. Process-Driven Mindset: Strong experience with process improvement and documentation, particularly for coding standards, task tracking, and data management protocols. Leadership and Mentorship: Demonstrated ability to mentor and support junior and mid-level engineers, with a focus on fostering technical growth and improving team cohesion. Experience leading code reviews and guiding team members in problem-solving and troubleshooting. Problem-Solving Skills: Ability to handle complex technical issues and serve as a key resource for team troubleshooting. Expertise in guiding others through debugging and technical problem-solving. Strong Communication Skills: Ability to clearly articulate technical challenges, propose effective solutions, and align cross-functional teams on project requirements, technical standards, and data workflows. Strong at conveying complex ideas to both technical and non-technical stakeholders, ensuring transparency and collaboration. 
Skilled in documenting data issues, methodologies, and technical workflows for knowledge sharing. Adaptability and Continuous Learning: Stay updated on data engineering trends and foster a culture of continuous learning and process evolution within the team. Data Pipelines: Hands-on experience in building, maintaining, and optimizing ETL/ELT pipelines, including data transfer, transformation, and visualization, for real-world applications. Strong understanding of data workflows and ability to troubleshoot pipeline issues quickly, with the ability to automate repetitive data processes to improve efficiency and reliability. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
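For illustration only — a minimal, hypothetical sketch of the kind of data validation framework this role mentions (the field names, IDs and bounds are invented; real checks would run against the team's actual datasets):

```python
# Hypothetical records collected by researchers; None marks a missing value.
records = [
    {"id": 1, "apr": 19.9}, {"id": 2, "apr": None},
    {"id": 2, "apr": 21.5}, {"id": 3, "apr": 250.0},
]

def validate(rows, field, lo, hi):
    """Report basic integrity issues for one numeric field:
    duplicate IDs, missing values, and out-of-range values."""
    seen, dup_ids, missing, out_of_range = set(), [], [], []
    for r in rows:
        if r["id"] in seen:
            dup_ids.append(r["id"])
        seen.add(r["id"])
        v = r[field]
        if v is None:
            missing.append(r["id"])
        elif not (lo <= v <= hi):
            out_of_range.append(r["id"])
    return {"duplicates": dup_ids, "missing": missing, "out_of_range": out_of_range}

report = validate(records, "apr", 0.0, 100.0)
print(report)  # → {'duplicates': [2], 'missing': [2], 'out_of_range': [3]}
```

The same checks scale up naturally as automated tests in a pipeline, which is the "automated test coverage" the listing asks the manager to advocate for.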

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Delhi

On-site

Job Overview: Position Name: Faculty, Wadhwani Center for Government Digital Transformation (WGDT) Work Location: Delhi About Wadhwani Foundation ( www.wfglobal.org ): Mission : Accelerating economic development in emerging economies through high-value job creation Objectives: Enabling the creation of 10M jobs and placement of 25M by 2030 across 20-25 emerging economies Wadhwani Foundation is a not-for-profit with the primary mission of accelerating economic development in emerging economies by driving large-scale job creation through entrepreneurship, innovation and skills development. Founded in 2000 by Silicon Valley entrepreneur, Dr. Romesh Wadhwani, today the Foundation is scaling impact in 25 countries across Asia, Africa, and Latin America through various Initiatives. More details on the various programs at the end of the document. Job Description: Learning Strategy & Subject Matter Expertise Work in conjunction with the WGDT Academy team to decide subject matter and the best methodologies for training the target audiences (central and state government bureaucrats) Create the content on Emerging Technologies such as data science, machine learning, computer vision, natural language processing, Generative AI for a senior audience of government officials with relevant social sector examples and use cases. Help formulate case studies using no/low code tools for senior policymakers. 
Review the learning content as designed by the Curriculum designer to ensure accuracy and depth from the subject matter perspective Research, produce and deliver high-quality learning assets like training decks, facilitator guides, learner guides, assessments, and other supporting content Learning Delivery Demonstrate strong teaching skills for a senior audience in both a classroom and virtual classroom environment and be able to modify teaching styles accordingly Manage multiple teaching projects simultaneously and liaise with the stakeholders to execute course requirements Take full responsibility for assigned cohorts from a classroom set up, to group assignments, to learning intervention, and then on to data collection on usage, assessment, quality, feedback, etc. Be able to collate and illustrate points using the flipped classroom and case study methodology, as per the major requirements of adult learning Identify and address individual learner requirements so that there is “no student left behind”, which includes follow-ups for assignments, assessments, and feedback to and from learners Demonstrate excellent stakeholder relationship management skills Use all modern communication tools like Teams, Zoom, or other learning platforms as might be required She/he has experience in both in-person and online training for a senior audience. Requirements You have at least 7 years of experience You have at least 3 years of experience in the emerging technology as trainer (freelance or full time) You possess awareness and deep knowledge of the subject area including latest analytics based technologies You can instruct senior-level learners, with a talent for effectively engaging adult students of diverse ages and backgrounds. You have competency in teach technical subjects to a non-technical audience, using simple language and avoiding excessive jargon. 
Work in governance and policy will be an asset but is not essential Effective verbal communication skills Technical skills: o Expert level knowledge of one or more of the Emerging Technologies such as data science, machine learning, computer vision, natural language processing, Generative AI and large language models o Knowledge of a no/low code tools like Orange/Knime is helpful (but not essential) o Knowledge of Python/ R is helpful (but not essential) o Ability to handle and engage a heterogeneous participant base with maturity o Experience in using and creating content for Virtual Learning platforms, MOOCs o Experience in building new case studies, use cases and assessments in emerging technology areas o At least a Bachelors’ degree Bachelors in Technology / Masters in Technology

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Vellore, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. 
Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. 
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. 
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. 
Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
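For illustration only — a minimal, hypothetical sketch of the import workflow this listing describes (migrating spreadsheet data into a relational database with accuracy and consistency checks). The CSV contents are invented, and stdlib sqlite3 stands in for PostgreSQL/MySQL/BigQuery:

```python
import csv
import io
import sqlite3

# A tiny stand-in for a spreadsheet export (hypothetical columns).
csv_text = "name,price\nWidget,9.99\nGadget,\nWidget,9.99\n"

conn = sqlite3.connect(":memory:")  # sqlite3 stands in for PostgreSQL/MySQL here
conn.execute("CREATE TABLE products (name TEXT, price REAL, UNIQUE(name, price))")

loaded, skipped = 0, 0
for row in csv.DictReader(io.StringIO(csv_text)):
    if not row["price"]:            # accuracy check: reject rows with blank prices
        skipped += 1
        continue
    try:
        conn.execute("INSERT INTO products VALUES (?, ?)",
                     (row["name"], float(row["price"])))
        loaded += 1
    except sqlite3.IntegrityError:  # consistency check: drop exact duplicates
        skipped += 1

print(loaded, skipped)  # → 1 2
```

A production workflow would add logging, batch commits, and database-specific loaders (e.g. PostgreSQL COPY or BigQuery load jobs), but the shape — read, validate, insert, count — is the same.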

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Madurai, Tamil Nadu, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, Postgre SQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, Pyspark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. 
Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. 
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
Monitor database health and identify and resolve issues.
Collaborate with the team's full-stack web developer to support the implementation of efficient data access and retrieval mechanisms.
Implement data security measures to protect sensitive information and comply with relevant regulations.
Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
Familiarize yourself with tools and technologies used in the team's workflow, such as KNIME for data integration and analysis.
Use Python for tasks such as data manipulation, automation, and scripting.
Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
Assume accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
Collaborate with and assist fellow members of the Data Research Engineering Team as required.
Perform tasks with precision and build reliable systems.
Leverage online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations.

Skills and Experience:

Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
Knowledge of SQL and an understanding of database design principles, normalization, and indexing.
Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
Eagerness to develop import workflows and scripts to automate data import processes.
Knowledge of data security best practices, including access controls, encryption, and compliance standards.
Strong problem-solving and analytical skills with attention to detail.
Creative and critical thinking.
Strong willingness to learn and expand knowledge in data engineering.
Familiarity with Agile development methodologies is a plus.
Experience with version control systems, such as Git, for collaborative development.
Ability to thrive in a fast-paced environment with rapidly changing priorities.
Ability to work collaboratively in a team environment.
Good and effective communication skills.
Comfortable with autonomy and the ability to work independently.
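Several of the skills listed above (SQL, indexing, query execution plans, Python) come together in a small example. This uses SQLite's EXPLAIN QUERY PLAN purely as a stand-in for reading execution plans in PostgreSQL or MySQL, and the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, payer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO claims (payer, amount) VALUES (?, ?)",
    [("Acme", 100.0), ("Zenith", 250.0)] * 50,
)

def plan_for(query: str) -> str:
    """Return the planner's textual description of how it will run the query."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(row[-1] for row in rows)

query = "SELECT SUM(amount) FROM claims WHERE payer = 'Acme'"
before = plan_for(query)  # a full table scan: no index on payer exists yet
conn.execute("CREATE INDEX idx_claims_payer ON claims (payer)")
after = plan_for(query)   # the planner now uses the new index instead
print(before)
print(after)
```

Comparing the plan before and after adding the index is the basic loop behind the "analyze execution plans, implement indexing strategies" responsibility.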
Perks:

Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote



Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


Experience: 8.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

What do you need for this opportunity?

Must-have skills: SQL, Python, Tableau, OOP, KNIME, data integrity, QA experience, ETL tools, leadership

Forbes Advisor is looking for:

Job Description: Data Integrity Manager

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. We bring rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Integrity Team is a brand-new team whose purpose is to ensure that all primary, publicly accessible data collected by our researchers is correct and accurate, allowing the insights produced from this data to be reliable. The team collaborates with other teams while also operating independently.
The team's responsibilities include monitoring researched data so that errors are identified and caught as soon as possible, developing detection techniques to find and fix issues, setting up new configurations and ensuring they are correct, testing new developments to guarantee that data quality is not compromised, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. It plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Data Integrity Manager involves guiding team members through their tasks while watching for the next set of possible problems. They should understand how to automate systems, apply optimization techniques, and follow best practices in debugging, testing, and issue detection. They work closely with other team members, offering technical mentorship as well as advanced Python, SQL, and data visualization practices.

Responsibilities:

Technical Mentorship and Code Quality: Mentor team members on coding standards, optimization, and debugging while conducting code and report reviews to enforce high code quality. Provide constructive feedback and enforce quality standards.
Testing and Quality Assurance Leadership: Lead the development and implementation of rigorous testing protocols to ensure project reliability and advocate for automated test coverage.
Process Improvement and Documentation: Establish and refine standards for version control, documentation, and task tracking. Continuously refine these processes to enhance team productivity, streamline workflows, and ensure data quality.
Hands-On Technical Support: Provide expert troubleshooting support in Python, MySQL, GitKraken, Tableau and KNIME, helping the team resolve complex technical issues.
Provide on-demand support to team members, helping them overcome technical challenges and improve their problem-solving skills.
High-Level Technical Mentorship: Provide mentorship in advanced technical areas, including best practices, data visualization, and advanced Python programming. Guide the team in building scalable and reliable solutions to continually track and monitor data quality.
Cross-Functional Collaboration: Partner with data scientists, product managers, and data engineers to align data requirements, testing protocols, and process improvements. Foster open communication across teams to ensure seamless integration and delivery of data solutions.
Continuous Learning and Improvement: Stay updated with emerging data engineering methodologies and best practices, sharing relevant insights with the team. Drive a culture of continuous improvement, ensuring the team's skills and processes evolve with industry standards.
Data Pipelines: Design, implement, and maintain scalable data pipelines for efficient data transfer, transformation, and visualization in production environments.

Skills and Experience:

Educational Background: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Equivalent experience in data engineering roles will also be considered.
Data Integrity & Validation Experience: Strong ability to assess, validate, and ensure the integrity of large datasets, with experience in identifying data inconsistencies, anomalies, and patterns that indicate data quality issues. Proficient in designing and implementing data validation frameworks.
Analytical & Problem-Solving Mindset: Critical thinking with a habit of asking "why": why anomalies exist, why trends deviate, and what underlying factors are at play. Strong diagnostic skills to identify root causes of data issues and propose actionable solutions. Ability to work with ambiguous data and derive meaningful insights.
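A data validation framework of the kind described above can be sketched as a small set of named rule objects run over every record. The rules and field names here are hypothetical; a real framework would encode the team's actual quality standards:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """One named validation rule applied to every record."""
    name: str
    check: Callable[[dict], bool]

def validate(records: list[dict], rules: list[Rule]) -> list[str]:
    """Run every rule over every record and collect human-readable failures."""
    failures = []
    for i, record in enumerate(records):
        for rule in rules:
            if not rule.check(record):
                failures.append(f"record {i}: failed {rule.name}")
    return failures

# Hypothetical researched data and the rules that guard it.
rules = [
    Rule("price_is_positive", lambda r: r["price"] > 0),
    Rule("name_present", lambda r: bool(r["name"])),
]
records = [
    {"name": "Plan A", "price": 9.99},
    {"name": "", "price": -1.0},  # violates both rules
]
failures = validate(records, rules)
print(failures)
```

Keeping each rule small and named makes failures easy to report, and new checks can be added without touching the validation loop itself.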
Attention to Detail: Meticulous attention to data nuances, capable of spotting subtle discrepancies. Strong focus on data accuracy, completeness, and consistency across systems.
Programming: Expert-level skills in Python, with a strong understanding of code optimization, debugging, and testing.
Object-Oriented Programming (OOP) Expertise: Strong knowledge of OOP principles in Python, with the ability to understand modular, reusable, and efficient code structures. Experience in implementing OOP best practices to enhance code organization and maintainability.
Data Management: Proficiency in MySQL and database design, with experience in creating efficient data pipelines and workflows.
Tools: Advanced knowledge of Tableau. Familiarity with KNIME or similar data processing tools is a plus.
Testing and QA Expertise: Proven experience in designing and implementing testing protocols, including unit, integration, and performance testing.
Process-Driven Mindset: Strong experience with process improvement and documentation, particularly for coding standards, task tracking, and data management protocols.
Leadership and Mentorship: Demonstrated ability to mentor and support junior and mid-level engineers, with a focus on fostering technical growth and improving team cohesion. Experience leading code reviews and guiding team members in problem-solving and troubleshooting.
Problem-Solving Skills: Ability to handle complex technical issues and serve as a key resource for team troubleshooting. Expertise in guiding others through debugging and technical problem-solving.
Strong Communication Skills: Ability to clearly articulate technical challenges, propose effective solutions, and align cross-functional teams on project requirements, technical standards, and data workflows. Strong at conveying complex ideas to both technical and non-technical stakeholders, ensuring transparency and collaboration.
Skilled in documenting data issues, methodologies, and technical workflows for knowledge sharing.
Adaptability and Continuous Learning: Stays updated on data engineering trends and fosters a culture of continuous learning and process evolution within the team.
Data Pipelines: Hands-on experience in building, maintaining, and optimizing ETL/ELT pipelines, including data transfer, transformation, and visualization, for real-world applications. Strong understanding of data workflows, the ability to troubleshoot pipeline issues quickly, and the ability to automate repetitive data processes to improve efficiency and reliability.

Perks:

Day off on the 3rd Friday of every month (one long weekend each month)
Monthly Wellness Reimbursement Program to promote health and well-being
Monthly Office Commutation Reimbursement Program
Paid paternity and maternity leaves

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Vellore, Tamil Nadu, India

Remote



Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Faridabad, Haryana, India

Remote


Experience : 5.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Forbes Advisor) What do you need for this opportunity? Must have skills required: Python, PostgreSQL, Snowflake, AWS RDS, BigQuery, OOPs, Monitoring tools, Prometheus, ETL tools, Data warehouse, Pandas, PySpark, AWS Lambda Forbes Advisor is Looking for: Job Description: Data Research - Database Engineer Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. Position Overview At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Research Engineering Team is a brand-new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. 
Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities: Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance by implementing access controls, encryption, and compliance standards. Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. 
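An import workflow of the kind described above can be sketched in a few lines of Python. This is a minimal illustration, not Forbes Advisor's actual pipeline: it uses the standard library's sqlite3 as a stand-in for a PostgreSQL or BigQuery target, and the table and column names are invented for the example.

```python
import csv
import io
import sqlite3

def import_rows(conn, csv_text, table="survey_responses"):
    """Load CSV rows into a table, quarantining malformed rows instead of failing."""
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table} "
        "(respondent_id INTEGER PRIMARY KEY, region TEXT NOT NULL, score REAL)"
    )
    clean, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            clean.append((int(row["respondent_id"]), row["region"].strip(), float(row["score"])))
        except (KeyError, ValueError, AttributeError):
            rejected.append(row)  # keep malformed rows for manual review
    conn.executemany(f"INSERT OR REPLACE INTO {table} VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean), rejected

conn = sqlite3.connect(":memory:")
loaded, rejected = import_rows(conn, "respondent_id,region,score\n1,APAC,4.5\n2,EMEA,3.8\nbad,row,oops\n")
print(loaded, len(rejected))  # 2 1
```

In production the same shape holds: swap the connection for a PostgreSQL driver such as psycopg2 or SQLAlchemy, and route the quarantined rows to a review table rather than discarding them, so data accuracy and consistency can be audited.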
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. 
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. 
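The responsibilities and skills above mention analyzing query execution plans and implementing indexing strategies. A tiny runnable sketch of that loop, using SQLite's EXPLAIN QUERY PLAN as a stand-in for PostgreSQL's EXPLAIN and an invented orders table:

```python
import sqlite3

# Stand-in for tuning a production PostgreSQL instance: create a table,
# then compare the query plan before and after adding an index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

def plan(conn, sql):
    # EXPLAIN QUERY PLAN is SQLite's analogue of PostgreSQL's EXPLAIN;
    # the last column of each row is the human-readable plan step.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(conn, query)   # full table scan (exact wording varies by SQLite version)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(conn, query)    # now a search using idx_orders_customer
print(before)
print(after)
```

The same before/after comparison, run against real query workloads, is how indexing decisions are usually justified: the index trades extra write and storage cost for turning a full scan into an index search.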
Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Faridabad, Haryana, India

Remote


Experience : 8.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: SQL, Python, Tableau, OOPs, KNIME, Data Integrity, QA Experience, ETL tools, Leadership Forbes Advisor is Looking for: Job Description: Data Integrity Manager Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. We bring rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Integrity Team is a brand-new team with the purpose of ensuring all primary, publicly accessible data collected by our researchers is correct and accurate, allowing the insights produced from this data to be reliable. They collaborate with other teams while also operating independently. 
Their responsibilities include monitoring researched data to ensure that errors are identified and caught as soon as possible, developing diagnostic skills to spot and mend issues, setting up new configurations and ensuring they are correct, testing new developments to guarantee data quality is not compromised, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Data Integrity Manager will involve guiding team members through their tasks while looking out for the next set of possible problems. They should understand how to automate systems, optimization techniques, and best practices in debugging, testing, and issue detection. They work closely with other team members, offering technical mentorship as well as guidance on advanced Python, SQL, and data visualization practices. Responsibilities: Technical Mentorship and Code Quality: Mentor team members on coding standards, optimization, and debugging while conducting code and report reviews to enforce high code quality. Provide constructive feedback and enforce quality standards. Testing and Quality Assurance Leadership: Lead the development and implementation of rigorous testing protocols to ensure project reliability and advocate for automated test coverage. Process Improvement and Documentation: Establish and refine standards for version control, documentation, and task tracking to improve productivity and data quality. Continuously refine these processes to enhance team productivity, streamline workflows, and ensure data quality. Hands-On Technical Support: Provide expert troubleshooting support in Python, MySQL, GitKraken, Tableau and Knime, helping the team resolve complex technical issues. 
Provide on-demand support to team members, helping them overcome technical challenges and improve their problem-solving skills. High-Level Technical Mentorship: Provide mentorship in advanced technical areas, including best practices, data visualization and advanced Python programming. Guide the team in building scalable and reliable solutions to continually track and monitor data quality. Cross-Functional Collaboration: Partner with data scientists, product managers, and data engineers to align data requirements, testing protocols, and process improvements. Foster open communication across teams to ensure seamless integration and delivery of data solutions. Continuous Learning and Improvement: Stay updated with emerging data engineering methodologies and best practices, sharing relevant insights with the team. Drive a culture of continuous improvement, ensuring the team’s skills and processes evolve with industry standards. Data Pipelines: Design, implement and maintain scalable data pipelines for efficient data transfer, transformation, and visualization in production environments. Skills and Experience: Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. Equivalent experience in data engineering roles will also be considered. Data Integrity & Validation Experience: Strong ability to assess, validate, and ensure the integrity of large datasets, with experience in identifying data inconsistencies, anomalies, and patterns that indicate data quality issues. Proficient in designing and implementing data validation frameworks. Analytical & Problem-Solving Mindset: Critical thinking with a habit of asking "why": why anomalies exist, why trends deviate, and what underlying factors are at play. Strong diagnostic skills to identify root causes of data issues and propose actionable solutions. Ability to work with ambiguous data and derive meaningful insights. 
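A data validation framework of the kind mentioned above can be as small as a dictionary of named rule functions applied row by row. A hypothetical stdlib-only sketch, with field names and thresholds invented for illustration:

```python
# A validation "framework" in miniature: named rule functions applied
# row by row, with a report of which rows violated which rule.
def check_required(field):
    return lambda row: row.get(field) not in ("", None)

def check_range(field, lo, hi):
    return lambda row: isinstance(row.get(field), (int, float)) and lo <= row[field] <= hi

CHECKS = {
    "name_present": check_required("product_name"),   # field names are invented
    "price_present": check_required("price"),
    "price_in_range": check_range("price", 0, 10_000),
}

def validate(rows):
    report = {name: [] for name in CHECKS}
    for i, row in enumerate(rows):
        for name, check in CHECKS.items():
            if not check(row):
                report[name].append(i)  # record the offending row's index
    return report

rows = [
    {"product_name": "Widget", "price": 19.99},
    {"product_name": "", "price": -5},   # empty name, negative price
    {"product_name": "Gadget"},          # price missing entirely
]
print(validate(rows))
```

A real implementation would externalize the rules to configuration and feed the report into dashboards or alerts, but the shape, named checks producing a per-rule failure report, stays the same.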
Attention to Detail: Meticulous attention to data nuances, capable of spotting subtle discrepancies. Strong focus on data accuracy, completeness, and consistency across systems. Technical Proficiency: Programming: Expert-level skills in Python, with a strong understanding of code optimization, debugging, and testing. Object-Oriented Programming (OOP) Expertise: Strong knowledge of OOP principles in Python, with the ability to understand modular, reusable, and efficient code structures. Experience in implementing OOP best practices to enhance code organization and maintainability. Data Management: Proficient in MySQL and database design, with experience in creating efficient data pipelines and workflows. Tools: Advanced knowledge of Tableau. Familiarity with Knime or similar data processing tools is a plus. Testing and QA Expertise: Proven experience in designing and implementing testing protocols, including unit, integration, and performance testing. Process-Driven Mindset: Strong experience with process improvement and documentation, particularly for coding standards, task tracking, and data management protocols. Leadership and Mentorship: Demonstrated ability to mentor and support junior and mid-level engineers, with a focus on fostering technical growth and improving team cohesion. Experience leading code reviews and guiding team members in problem-solving and troubleshooting. Problem-Solving Skills: Ability to handle complex technical issues and serve as a key resource for team troubleshooting. Expertise in guiding others through debugging and technical problem-solving. Strong Communication Skills: Ability to clearly articulate technical challenges, propose effective solutions, and align cross-functional teams on project requirements, technical standards, and data workflows. Adept at conveying complex ideas to both technical and non-technical stakeholders, ensuring transparency and collaboration. 
Skilled in documenting data issues, methodologies, and technical workflows for knowledge sharing. Adaptability and Continuous Learning: Stay updated on data engineering trends and foster a culture of continuous learning and process evolution within the team. Data Pipelines: Hands-on experience in building, maintaining, and optimizing ETL/ELT pipelines, including data transfer, transformation, and visualization, for real-world applications. Strong understanding of data workflows and ability to troubleshoot pipeline issues quickly, with the ability to automate repetitive data processes to improve efficiency and reliability. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies