3.0 - 7.0 years
0 Lacs
maharashtra
On-site
Are you seeking an exciting opportunity to join a dynamic and growing team in a fast-paced and challenging environment? This unique opening invites you to collaborate with the Business team to deliver a comprehensive view. As a Strategic Analytics Associate in the Fraud-Risk Strategy team, you will collaborate to produce insightful analytics and recommendations for the business regarding 1st-party (scams and wrongdoer/mule) and 3rd-party (account takeover, third-party inducement) risk identification and mitigation. Your responsibilities will include strategy development, implementation, operational controls, and performance monitoring.

Key Responsibilities:
- Develop and maintain regular analytics to offer management a complete understanding of emerging fraud trends.
- Gain a detailed comprehension of essential performance metrics and profitability drivers to provide insights across the entire account lifecycle.
- Acquire familiarity with operational processes (e.g., manual underwriting, portfolio management, collections) to grasp acquisition performance drivers.
- Conduct ad hoc analytics and contribute to various projects representing Risk Management.

Required Qualifications, Capabilities, and Skills:
- MS degree with 3 years of Risk Management experience, or BS degree with a minimum of 5 years of Risk Management or other quantitative experience.
- Background in statistics, econometrics, or another quantitative field.
- Advanced proficiency in SAS, SAS Enterprise Miner, or other decision tree software.
- Ability to extract insights from large datasets and convert raw data into actionable management information.
- Familiarity with risk analytics techniques.
- Strong analytical and problem-solving skills.
- Excellent written and verbal communication abilities.
- Experience in presenting recommendations to management.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
vijayawada, andhra pradesh
On-site
As a Software Business Analyst, you will play a crucial role in connecting business goals with technical implementations. Your responsibilities will include gathering and analyzing requirements, documenting workflows, collaborating with development teams throughout the software development life cycle (SDLC), and contributing to the delivery of software solutions that enhance efficiency and add value.

You will engage with stakeholders through interviews, workshops, and surveys to collect and validate both functional and non-functional requirements. It will be your responsibility to document the business context, define project scope, and articulate clear business and system requirements. Using tools such as BPMN or flowcharting, you will model current and future business processes. Translating business needs into user stories and API specifications, and collaborating on backlog refinement, will be key aspects of your role. Your ability to conduct data-driven analysis to derive actionable insights and business recommendations will be essential. Additionally, you will coordinate and support User Acceptance Testing (UAT), prepare reports and dashboards, and share findings with stakeholders to aid decision-making.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Business, or a related field, and a minimum of 3-4 years of experience in software/IT business or systems analyst roles, with a proven track record of involvement in Agile software delivery and SDLC phases. Domain knowledge in REST APIs, databases, SaaS systems, or product-led software environments will be advantageous. To excel in this role, you will need strong analytical and problem-solving skills, along with excellent communication, facilitation, and stakeholder management abilities. Proficiency in tools such as JIRA, Confluence, BPMN, SQL, and Excel (including advanced functions and pivot tables) will be necessary. Familiarity with Agile frameworks like Scrum and Kanban, as well as experience with visualization tools like Tableau or Power BI, will also be beneficial.

This is a full-time position that requires in-person work. If you are looking to leverage your skills and experience in a dynamic environment where you can make a significant impact, we encourage you to apply for this Software Business Analyst role.
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
haryana
On-site
At PwC, our team in audit and assurance focuses on providing independent and objective assessments of financial statements, internal controls, and other assurable information to enhance credibility and reliability for stakeholders. We evaluate compliance with regulations, and assess governance and risk management processes and related controls. As part of the data, analytics, and technology solutions team, you will assist clients in developing solutions that build trust, drive improvement, and detect, monitor, and predict risks. Your work will involve utilizing advanced analytics, data wrangling technology, and automation tools to leverage data and establish the right processes for clients to make efficient decisions based on accurate and trustworthy information.

You are expected to be driven by curiosity and to be a reliable team member in a fast-paced environment. Working with various clients and team members will present different challenges and scope, providing opportunities for learning and growth. Taking ownership and consistently delivering quality work that adds value for clients and contributes to team success is crucial. Building a personal brand within the firm will open doors to more opportunities for you.

As an Associate, your responsibilities include designing and developing ways to automate and reimagine audits, implementing innovative technologies such as Alteryx, SQL, Python, Power BI, and PowerApps. You will develop a strong understanding of the role of data and analytics in modern audits and work on technical assignments to enhance skills in data analytics and visualization. Client engagements, data management, analytics and reporting, advanced analytics, and building relationships with engagement teams and clients are key aspects of your day-to-day responsibilities.

Preferred qualifications for this role include a Bachelor's or Master's degree in Computer Science, Data Analytics, or Accounting with a minimum of 1 year of relevant experience. Candidates with Big 4 or equivalent experience are preferred. Essential skills required include market credentials in data & analytics, stakeholder management, project management, analytical and problem-solving capabilities, and a long-term career ambition at PwC. Desirable skills include finance process knowledge, audit experience, use of technology in data & analytics, and experience working in financial reporting, financial accounting, regulatory compliance, or internal audit. Technical skills needed for this role encompass data transformation and modeling, data storage and querying, data visualization, understanding data quality issues, data cleansing, robotics, finance/accounting understanding, and knowledge of current data science software platforms.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer with over 6 years of experience, you will play a crucial role in managing the migration from Informatica MDM to Ataccama MDM. Your responsibilities will include developing migration strategies, plans, and timelines while ensuring data accuracy, consistency, and completeness throughout the migration process.

You will be tasked with managing ETL processes to extract, transform, and load data into Ataccama. Additionally, implementing and maintaining data quality rules and processes in Ataccama, as well as overseeing API integrations for seamless data flow, will be part of your daily tasks.

Collaboration and coordination are vital aspects of this role: you will work closely with cross-functional teams to gather requirements, provide training and support on Ataccama MDM, and troubleshoot migration issues in collaboration with IT and business units. Your role will also involve documentation and reporting tasks such as documenting migration processes, generating reports on migration progress and data quality metrics, and providing recommendations for continuous improvement in data management practices.

To excel in this position, you should possess proven experience in migrating from Informatica MDM to Ataccama MDM; hands-on experience with ETL, data quality, and MDM processes; and proficiency in Ataccama MDM and related tools. Strong analytical and problem-solving skills, attention to detail, proficiency in data modeling and database management, and excellent communication and interpersonal skills are essential for success. A Bachelor's degree in Information Management, Computer Science, Data Science, or a related field is required, and Ataccama MDM certification is preferred. Additionally, holding an Australian visa, knowledge of industry standards and regulations related to data management, proficiency in SQL and data querying languages, and a willingness to learn Ataccama are considered advantageous for this role.
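As an illustration of the kind of data quality rules such a migration depends on, here is a minimal sketch in plain Python. The record fields (`id`, `email`, `country`) and the rule choices are hypothetical examples, not Ataccama functionality:

```python
# Hypothetical completeness and uniqueness checks over migrated records.

def check_completeness(records, required_fields):
    """Return ids of records where any required field is missing or empty."""
    failed = []
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required_fields):
            failed.append(rec.get("id"))
    return failed

def check_uniqueness(records, key):
    """Return values of `key` that appear in more than one record."""
    seen, dupes = set(), set()
    for rec in records:
        val = rec.get(key)
        if val in seen:
            dupes.add(val)
        seen.add(val)
    return sorted(dupes)

records = [
    {"id": 1, "email": "a@x.com", "country": "AU"},
    {"id": 2, "email": "", "country": "AU"},         # fails completeness
    {"id": 3, "email": "a@x.com", "country": "NZ"},  # duplicate email
]

print(check_completeness(records, ["email", "country"]))  # [2]
print(check_uniqueness(records, "email"))                 # ['a@x.com']
```

In practice such rules would run both before and after the load, so discrepancies can be traced to the migration step that introduced them.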
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Data Catalog Developer specializing in the Alation platform at Ciena, you will play a pivotal role in designing, developing, and implementing data catalog solutions. Collaborating with cross-functional teams, you will ensure that data catalog initiatives meet business needs and enhance data quality across the organization. Your expertise in data catalog development will be critical to improving data management capabilities and supporting strategic decision-making.

Partnering With Business Teams: You will collaborate with data owners, stewards, and business leaders to gather requirements and define data catalog strategies aligned with business objectives. Acting as a key technical resource between business units and IT teams, you will ensure seamless integration of data catalog solutions with existing systems and processes. Providing guidance and best practices on data modeling, data governance, and data catalog lifecycle management, you will drive user engagement, adoption, and continuous design and configuration across the Alation data catalog program.

Project Execution: Your responsibilities will include developing and implementing data catalog solutions on the Alation platform, adhering to best practices and industry standards. You will create technical specifications, design documents, and implementation plans for data catalog projects, ensuring timely delivery and high-quality outcomes. Effectively communicating technical concepts to both technical and non-technical audiences is essential for ensuring alignment and understanding across teams. You will also be responsible for system testing, resolving defects, facilitating discussions around business issues, and engaging relevant resources related to data and integration.

Metrics For Success: Collaborating with business and IT partners, you will define key performance indicators (KPIs) for data catalog initiatives that align with organizational goals. Establishing data quality metrics to measure the accuracy, consistency, and completeness of data within the catalog will be crucial. Tracking data catalog adoption metrics and their impact on business processes and decision-making, as well as gathering and analyzing stakeholder feedback to continuously enhance data catalog processes and solutions, will also be part of your success metrics.

The Must Haves

Education:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field.

Experience:
- Minimum of 3-5 years of experience in data catalog development, specifically with the Alation platform, demonstrating a successful track record of delivering data catalog projects.

Functional Skills:
- Good understanding of data governance frameworks and methodologies, including data lineage, metadata management, MDM, Reference Data Management, and compliance with data privacy regulations.
- Strong understanding of data catalog and data dictionary principles, data management best practices, data quality management, and data governance practices within an Alation environment.
- Experience in data querying, profiling, data cleansing, and data transformation processes.

Alation Technical Skills:
- Subject Matter Expert for the Alation platform.
- Expertise in configuring the Alation data model, data lineage, and metadata management.
- Proficient in working with Alation APIs.
- Experience managing Reference Data Management (RDM), user management, UI configuration, workflows, loading/exporting data, and optimizing processes.
- Design, data modeling, and management of large datasets/data models.
- Hands-on experience onboarding metadata from various sources.

General Skills:
- Excellent verbal and written communication skills.
- Strong analytical and problem-solving skills.
- Experience with Agile project management methodologies and tools.

Assets:
- Additional experience in the MDM space working on the Reltio platform.
- Knowledge of cloud storage solutions.
- Experience with programming languages like Java, Python, and JavaScript, plus API protocols and data formats.
- Experience with data warehouses and data visualization tools.

Ciena is an Equal Opportunity Employer that values diversity and respects its employees. Accommodation measures are available upon request.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
Job Summary: As a Data Engineer at WNS (Holdings) Limited, your primary responsibility will be handling complex data tasks with a focus on data transformation and querying. You must possess strong proficiency in advanced SQL techniques and a deep understanding of database structures. Your role will involve extracting and analyzing raw data to support the Reporting and Data Science team, providing both qualitative and quantitative insights to meet business requirements.

Responsibilities:
- Design, develop, and maintain data transformation processes using SQL.
- Manage complex data tasks related to data processing and querying.
- Collaborate with the Data team to comprehend data requirements and efficiently transform data in SQL environments.
- Construct and optimize data pipelines to ensure smooth data flow and transformation.
- Uphold data quality and integrity throughout the data transformation process.
- Address and resolve data issues promptly as they arise.
- Document data processes, workflows, and transformation logic.
- Engage with clients to identify reporting needs and leverage visualization experience to propose optimal solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 2 years of experience in a Data Engineering role.
- Deep proficiency in SQL, including advanced techniques for data manipulation and querying.
- Hands-on experience with Power BI data models and DAX commands.
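The SQL transformation work described above can be sketched with an in-memory SQLite database; the table and column names here are hypothetical, chosen only to illustrate a raw-to-reporting aggregation step:

```python
import sqlite3

# Load hypothetical raw rows, then transform them into a reporting table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales (region TEXT, amount REAL);
    INSERT INTO raw_sales VALUES ('North', 100.0), ('North', 50.0), ('South', 75.0);
""")

# Transformation: aggregate raw rows per region into a summary table.
conn.executescript("""
    CREATE TABLE sales_summary AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS n_orders
    FROM raw_sales
    GROUP BY region;
""")

rows = conn.execute(
    "SELECT region, total_amount, n_orders FROM sales_summary ORDER BY region"
).fetchall()
print(rows)  # [('North', 150.0, 2), ('South', 75.0, 1)]
```

In a production pipeline the same pattern appears as staged transformations (raw, cleaned, aggregated), with the quality checks the listing mentions applied between stages.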
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

About the Airbnb Capability Center: The Airbnb Capability Centre was set up in 2017 in Gurgaon. We provide specialized operational services that enable Airbnb's business and functions across the world, including Finance Technology, Finance Shared Services, Analytics, and Engineering, among other verticals. Our offices are home to multi-skilled teams with an insightful and deep understanding of our business and community. We're hospitable, fun, and we welcome all with open arms.

A Typical Day:
- Review and investigate alerts or cases escalated from various sources, providing supporting documentation and narratives to help determine whether activities are unusual or out of character.
- Manage large volumes of incoming cases and execute decisions under the pressure of tight deadlines and regulatory time frames for TMIR.
- Leverage open- and closed-source systems to decide potential matches in the related queues.
- Compile appropriate case notes and other documentation to support the review, completed steps, recommended determinations, etc.
- Gather relevant on- and off-platform data and objective information to support a subjective determination throughout a case's lifecycle.
- Prepare detailed reporting to support independent decisions and justification for escalations and de-escalations, including extensive supporting documentation and comprehensive narratives.
- Assist in the assessment of potential risks, patterns, and trends based on completed case reviews and investigations.
- Assist and support leadership with procedure and workflow changes that may impact multiple teams and/or regions, with the support of your manager.
- Accurately identify and escalate compliance risks and opportunities to Compliance Management.

Your Expertise:
- Bachelor's degree or technical equivalent.
- Strong systems thinker.
- Superior communication skills, both written and verbal.
- Strong ability to think clearly and rationally in order to understand logical connections between various data points.
- Strong problem-solving skills with an emphasis on adaptability and resilience.
- Ability to clearly articulate complex case patterns.
- Ability to gather objective information, make a subjective determination, and defend that determination both verbally and in writing.
- Ability to work closely and build trust with your team and other teams.
- Exceptional organization and process management ability.
- Ability and desire to work in a fast-paced environment.
- Familiarity with SQL, Excel, and data querying skills is strongly preferred.

Hybrid Work Requirements & Expectations: To support productivity and maintain a professional hybrid work environment, employees are expected to adhere to the following:
- Workspace: A dedicated, quiet, and private workspace free from interruptions and external noise.
- Internet Connectivity: During working hours, maintain a minimum, consistent internet speed of 10 Mbps on your official devices to ensure reliability for work-related tasks, including calls and virtual meetings.
- Professionalism: Employees must remain fully engaged, respectful, and maintain a professional presence during virtual meetings, with video participation required unless otherwise approved.
- Confidentiality & Security: Employees are responsible for protecting Airbnb's Intellectual Property and Confidential Information. Work-related activities, including calls and meetings, must not be conducted in public places, while traveling, or in any setting that may compromise confidentiality or work quality.

Our Commitment To Inclusion & Belonging: Airbnb is committed to working with the broadest talent pool possible. We believe diverse ideas foster innovation and engagement, allow us to attract creatively-led people, and help us develop the best products, services, and solutions. All qualified individuals are encouraged to apply.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
uttar pradesh
On-site
As a Data Science Instructor on our dynamic team, you will play a crucial role in preparing students for the evolving data science landscape. Your primary focus will be on delivering immersive, hands-on learning experiences that equip future data professionals with practical skills in SQL, Python, machine learning, and real-world project development.

Your responsibilities will include developing and conducting engaging lessons and workshops covering core data science topics, with a notable emphasis on SQL and Python programming. You will guide students through foundational machine learning models such as linear regression, logistic regression, decision trees, and random forests, ensuring they comprehend both the theoretical concepts and practical applications. Moreover, you will design and facilitate hands-on projects that help students apply their knowledge effectively and build a robust portfolio of work. Your ability to explain complex technical concepts clearly and concisely will be pivotal in making the material accessible to a diverse group of learners. As a mentor, you will provide constructive feedback and guidance to support students in overcoming challenges and deepening their understanding. You will also collaborate with our team to continuously refine the curriculum to keep it relevant and cutting-edge.

To excel in this position, you must have proven experience as a Data Scientist, Machine Learning Engineer, or in a similar role, backed by a strong portfolio demonstrating expertise in Python and machine learning projects. Proficiency in Python for data analysis, manipulation, and model building, along with practical experience with common machine learning models, is essential, as is proficiency in SQL for data querying and management. An exceptional ability to design and lead hands-on projects that foster practical skill development, coupled with strong communication and presentation skills, will be key to your success.

Preferred qualifications include familiarity with Python libraries like Streamlit, experience with Natural Language Processing (NLP) concepts, exposure to Large Language Models (LLMs) or chatbot development, and prior experience in teaching, mentoring, or training roles.

If you are enthusiastic about empowering the next generation of data scientists and have a proven track record of building and explaining data science projects, we encourage you to reach out to us at hr@ndmit.com. Join us in shaping the future of data science education!
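As an example of the foundational models the listing mentions, here is a classroom-style sketch of simple linear regression fitted with the closed-form least-squares solution; the data values are made up for illustration:

```python
# Fit y = slope * x + intercept by minimizing squared error,
# using the textbook covariance/variance formulas.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # exactly y = 2x + 1, so the fit should recover it
slope, intercept = fit_line(xs, ys)
print(slope, intercept)  # 2.0 1.0
```

An instructor would typically walk students through this derivation first, then show the same fit via a library such as scikit-learn to connect theory with practice.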
Posted 2 weeks ago
2.0 - 3.0 years
3 - 7 Lacs
Gurugram, Delhi
Work from Office
We're Hiring: TL - Data Engineer
Locations: Gurgaon / Delhi (locals only)

Requirements (on day one we'll expect you to):
- Own your modules and take complete ownership of the project.
- Understand the scope, design, and business objective of the project and articulate it in the form of a design document.
- Be an experienced coder in Python, SQL, ETL, and orchestration tools.
- Have experience with PySpark (both batch and real-time), Kafka, SQL, and data querying tools.
- Have strong experience with one of the cloud services: GCP, AWS, or Azure.
- Familiarity with Snowflake or Databricks is a plus.
- Have experience with containerized solutions using Google Kubernetes Engine.
- Have good communication skills to interact with internal teams and customers.
- Develop and optimize data warehouses given the schema design.
Posted 2 weeks ago
2.0 - 5.0 years
2 - 5 Lacs
Hyderabad, Telangana, India
On-site
Job Responsibilities:
- Perform comprehensive ETL (Extract, Transform, Load) testing to ensure data accuracy, completeness, and consistency across various data sources and targets.
- Design, develop, and execute automation scripts for ETL testing using Python, enhancing testing efficiency and coverage.
- Validate data transformations and data integrity rules across complex data pipelines.
- Work with SQL databases to write complex queries for data validation, reconciliation, and troubleshooting during ETL processes.
- Test ETL processes deployed within the Azure cloud environment, ensuring data flow and performance meet requirements.
- Identify, document, and track defects found during ETL testing, collaborating with development teams for timely resolution.
- Analyze business requirements and technical specifications to create detailed test plans and test cases for ETL processes.

Required Skills:
- Strong hands-on experience in ETL testing methodologies and tools.
- Proficiency in SQL for data querying, validation, and analysis.
- Experience in test automation, particularly with Python scripting for ETL testing.
- Familiarity with the Azure cloud environment as it pertains to data integration and ETL processes.

Good to Have:
- Experience working in an Agile development environment.
- Knowledge of other cloud platforms beyond Azure.
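The reconciliation checks described above can be sketched as a small Python automation against an in-memory SQLite database; in practice the source and target would live in separate systems, and the table and column names here are hypothetical:

```python
import sqlite3

# Set up a hypothetical source and target after an ETL load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.5), (2, 20.0), (3, 4.5);
    INSERT INTO tgt VALUES (1, 10.5), (2, 20.0), (3, 4.5);
""")

def reconcile(conn, source, target, column):
    """Compare row counts and a column total between source and target."""
    q = "SELECT COUNT(*), SUM({col}) FROM {tbl}"
    src_count, src_sum = conn.execute(q.format(col=column, tbl=source)).fetchone()
    tgt_count, tgt_sum = conn.execute(q.format(col=column, tbl=target)).fetchone()
    return src_count == tgt_count, src_sum == tgt_sum

print(reconcile(conn, "src", "tgt", "amount"))  # (True, True)
```

A real test suite would wrap checks like this in a framework such as pytest, add row-level diffs for failures, and compare floating-point totals with a tolerance rather than strict equality.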
Posted 3 weeks ago
2.0 - 4.0 years
2 - 4 Lacs
Mumbai, Maharashtra, India
On-site
Teamware Solutions is seeking a skilled professional for the Azure Machine Learning / Databricks Engineer role. This position is crucial for designing, building, and deploying scalable machine learning solutions and data pipelines within the Microsoft Azure ecosystem. You'll work with relevant technologies, ensure smooth operations, and contribute significantly to business objectives through expert analysis, development, implementation, and troubleshooting within the Azure Machine Learning and Databricks domain.

Roles and Responsibilities:
- ML Solution Development: Design, develop, and implement end-to-end machine learning solutions, including data ingestion, feature engineering, model training, evaluation, and deployment, primarily using Azure Machine Learning and Azure Databricks.
- Data Pipeline Engineering: Build and optimize robust data pipelines for machine learning workloads, leveraging PySpark/Spark SQL within Azure Databricks for large-scale data processing and transformation.
- Azure ML Services: Work extensively with Azure Machine Learning services for model lifecycle management, including experiment tracking, model registry, and deploying models as web services or to Azure Kubernetes Service (AKS).
- Databricks Platform: Utilize Azure Databricks notebooks, clusters, and Delta Lake for collaborative data science and engineering, ensuring efficient and scalable execution of ML workloads.
- Model Optimization & Monitoring: Implement techniques for optimizing model performance and efficiency. Set up monitoring for deployed models to detect drift, bias, and performance degradation.
- Code Quality & MLOps: Write clean, maintainable, production-ready Python/PySpark code. Contribute to MLOps practices for automating ML workflows (CI/CD for ML models).
- Troubleshooting: Perform in-depth troubleshooting, debugging, and issue resolution for ML models, data pipelines, and platform-related problems within Azure ML and Databricks environments.
- Collaboration: Work closely with data scientists, data engineers, software developers, and business stakeholders to translate machine learning concepts into deployable, impactful solutions.

Preferred Candidate Profile:
- Azure ML Expertise: Strong hands-on experience with Microsoft Azure Machine Learning Studio and its associated services.
- Azure Databricks Proficiency: Proven experience developing and optimizing data and ML solutions on Azure Databricks using PySpark/Spark SQL.
- Python Programming: Excellent proficiency in Python for data manipulation, machine learning, and scripting.
- Machine Learning Fundamentals: Solid understanding of core machine learning concepts, algorithms, and model evaluation metrics.
- Cloud Data Services: Familiarity with other Azure data services (e.g., Azure Data Lake Storage, Azure Synapse Analytics, Azure Data Factory) is a plus.
- SQL Knowledge: Good proficiency in SQL for data querying and manipulation.
- Problem-Solving: Exceptional analytical and problem-solving skills with a methodical approach to complex data science and engineering challenges.
- Communication: Strong verbal and written communication skills to articulate technical solutions and collaborate effectively within a team.
- Education: Bachelor's degree in Computer Science, Data Science, Statistics, or a related technical field. Azure certifications related to Data Science or AI are a strong plus.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You will be responsible for building systems and APIs to collect, curate, and analyze data generated by biomedical dogs, devices, and patient data. Immediate work includes developing APIs and backends to handle Electronic Health Record (EHR) data, time-series sensor streams, and sensor/hardware integrations via REST APIs. Additionally, you will work on data pipelines and analytics for physiological, behavioral, and neural signals; machine learning and statistical models for biomedical and detection-dog research; and web and embedded integrations connecting software to real-world devices.

To excel in this role, you should be familiar with domains such as signal processing, basic statistics, stream processing, online algorithms, databases (especially time-series databases like VictoriaMetrics, and SQL databases including PostgreSQL, SQLite, and DuckDB), computer vision, and machine learning. Proficiency in Python, C++, or Rust is essential: the stack is primarily Python, with some modules in Rust/C++ where necessary. Firmware development is done in C/C++ (or Rust), and if you choose to work in C++/Rust, you may need to expose a Python API using pybind11/PyO3.

Your responsibilities will involve developing data pipelines for real-time and batch processing, as well as building robust APIs and backends for devices, research tools, and data systems. You will handle data transformations, storage, and querying for structured and time-series datasets, evaluate and enhance ML models and analytics, and collaborate with hardware and research teams to derive insights from messy real-world data. The focus will be on ensuring data integrity and correctness rather than brute-force scaling. If you enjoy creating reliable software and working with complex real-world data, we look forward to discussing this opportunity with you.

Key Skills: backend development, computer vision, data transformations, databases, analytics, data querying, C, Python, C++, signal processing, data storage, statistical models, API development, Rust, data pipelines, firmware development, stream processing, machine learning.
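One concrete instance of the online algorithms this role calls for is Welford's method, which maintains a running mean and variance over a sensor stream in a single pass, without storing the samples; the sample values below are illustrative:

```python
# Welford's online algorithm: numerically stable streaming mean/variance.

class RunningStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        """Incorporate one new sample from the stream."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        """Sample variance of everything seen so far."""
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for sample in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(sample)
print(stats.mean, stats.variance())
```

The same one-pass structure extends to streaming min/max, exponential moving averages, and simple anomaly flags, which is why it suits continuous physiological sensor feeds better than batch recomputation.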
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
jaipur, rajasthan
On-site
Amplework Software is a full-stack development agency that specializes in providing end-to-end software development solutions to clients globally. The company is dedicated to delivering high-quality products that meet business requirements by utilizing advanced technologies. Their expertise includes custom software development, mobile applications, AI-driven solutions, and enterprise applications. By joining Amplework Software, you will become part of an innovative team that is focused on driving digital transformation through technology. As an Mid-Level Python and AI Engineer at Amplework Software, your responsibilities will include assisting in building and training machine learning models using frameworks like TensorFlow, PyTorch, and Scikit-Learn. You will have the opportunity to experiment with pre-trained AI models for NLP, Computer Vision, and Predictive Analytics. Additionally, you will work with both structured and unstructured data, conduct preprocessing, and engage in feature engineering. Collaboration with data scientists and software engineers to integrate AI solutions into real-world applications is a key aspect of this role. Continuously learning, experimenting, and optimizing models to enhance performance and efficiency is also part of the job. Ideal candidates for this position should possess a strong foundation in Python, basic machine learning concepts, and a willingness to learn. Required qualifications include a Bachelor's degree in Computer Science, Engineering, AI, or a related field, proficiency in Python with experience in writing optimized and clean code, strong problem-solving skills, and an understanding of machine learning concepts such as linear regression, classification, decision trees, and feature engineering. Experience with data processing libraries like Pandas, NumPy, and Matplotlib, as well as basic knowledge of AI models and neural networks using frameworks such as Scikit-Learn, TensorFlow, or PyTorch are also required. 
Preferred qualifications:
- Experience with Natural Language Processing (NLP) using transformers, BERT, GPT, or OpenAI APIs.
- Basic understanding of AI model deployment using Flask, FastAPI, or TensorFlow Serving.
- Experience with SQL or NoSQL databases for querying datasets in AI applications.
- Participation in AI-related competitions, hackathons, or open-source projects.

In addition to technical skills, candidates should have a strong analytical and problem-solving mindset, the ability to work collaboratively in a team, the ability to communicate technical concepts effectively, eagerness to learn and apply new AI techniques, and excellent written and verbal English communication skills. Candidates who prefer strictly rule-based programming without flexibility for AI experimentation may not be suitable for this position, as it requires strong problem-solving skills and quick learning. A face-to-face interview will be conducted; applicants should apply only if they can attend the interview at the office. Joining the Amplework Software team offers the opportunity to be part of a passionate and collaborative team, work on cutting-edge projects, make a real impact, enjoy competitive benefits, and experience a great working environment.
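To illustrate the kind of model-training work this role describes, here is a minimal sketch using scikit-learn (the dataset and hyperparameters are illustrative assumptions, not part of the job requirements):

```python
# Hypothetical sketch: train and evaluate a decision-tree classifier,
# the sort of basic ML workflow listed in the role's responsibilities.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load a small built-in dataset and split it for training/evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit a shallow decision tree and measure held-out accuracy.
model = DecisionTreeClassifier(max_depth=3, random_state=42)
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

The same fit/predict/score pattern carries over to the TensorFlow and PyTorch work mentioned above, just with different model and training APIs.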
Posted 4 weeks ago
1.0 - 6.0 years
1 - 6 Lacs
Bengaluru, Karnataka, India
On-site
Undertake eCOA Data Management activities for assigned studies, with or without supervision. May serve as back-up to a DTL. Perform testing of programming and data transfers. Understand and comply with core operating procedures and working instructions. Meet assigned objectives and develop and maintain good communication and working relationships within the eCOA DM team.
- Database Design and Maintenance: Create and maintain clinical databases to ensure they are efficient and meet the needs of the clinical trials.
- Data Querying and Reporting: Manage queries and generate reports for analysis.
- Data Validation: Analyze and resolve discrepancies from Recon/CTMS inquiries.
- Study Close-out: Execute end-to-end study close-out activities, including archival.
- Identify opportunities to automate and improve data collection and management processes.

Qualifications:
- Bachelor's degree in pharmacy or equivalent preferred.
- 3-6 years of direct data management experience, preferably with a minimum of 1 year as a CDM project lead.
- Experience in electronic Clinical Outcome Assessment (eCOA) or Decentralized Clinical Trials (DCT) is a plus.
- Advanced proficiency in English, both spoken and written.
- Advanced skills in computer applications such as Microsoft Excel, Word, and Outlook.
- Advanced understanding of the drug development lifecycle and overall clinical research process.
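The data-validation duty above (resolving Recon/CTMS discrepancies) can be sketched as a simple reconciliation check in pandas; the column names and sample values here are hypothetical, not taken from any real study system:

```python
# Illustrative reconciliation sketch: flag subjects whose visit counts
# differ between a hypothetical eCOA export and a hypothetical CTMS export.
import pandas as pd

ecoa = pd.DataFrame({"subject_id": ["S01", "S02", "S03"], "visits": [4, 3, 5]})
ctms = pd.DataFrame({"subject_id": ["S01", "S02", "S03"], "visits": [4, 2, 5]})

# Join the two sources on subject and keep rows where the counts disagree.
merged = ecoa.merge(ctms, on="subject_id", suffixes=("_ecoa", "_ctms"))
discrepancies = merged[merged["visits_ecoa"] != merged["visits_ctms"]]
print(discrepancies["subject_id"].tolist())  # → ['S02']
```

In practice each flagged subject would become a query raised back to the site or the CTMS owner for resolution.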
Posted 1 month ago
6.0 - 10.0 years
6 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Data Analyst (6+ years), with 3-4 years in the Insurance domain. Proficiency in data querying, reporting, and visualization tools.
Role: Other | Industry Type: Insurance | Department: Other | Employment Type: Full Time, Permanent | Role Category: Other | Education (UG): B.Tech/B.E. in Any Specialization
Posted 2 months ago
6.0 - 10.0 years
6 - 10 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Data Analyst (6+ years), with 3-4 years in the Insurance domain. Proficiency in data querying, reporting, and visualization tools.
Role: Other | Industry Type: Insurance | Department: Other | Employment Type: Full Time, Permanent | Role Category: Other | Education (UG): B.Tech/B.E. in Any Specialization
Posted 2 months ago
8 - 12 years
7 - 12 Lacs
Gurugram
Work from Office
Please find below the brief job description for the Team Lead - Product Operations role.

Role: Team Lead - Product Operations
Experience: 8-12 years
Location: Gurgaon
Shift timings: Day shift
Work mode: Hybrid

Job Description: Team Lead - Product Operations with 8 to 12 years of experience and strong technical skills in Business Operations, SQL, Python, Alteryx, and Product Analytics. Experience in Retail Banking and Cards & Payments is a plus. This is a hybrid work model with day shifts and no travel required.

Responsibilities:
- Lead the product operations team to ensure seamless business operations and high-quality deliverables.
- Oversee the development and implementation of SQL queries to support data analysis and reporting.
- Provide expertise in Python programming to automate and optimize business processes.
- Utilize Alteryx for data blending and advanced analytics to drive business insights.
- Conduct product analytics to measure performance and identify areas for improvement.
- Collaborate with cross-functional teams to align product operations with business objectives.
- Ensure data integrity and accuracy in all business operations and reporting.
- Develop and maintain documentation for all processes and procedures.
- Monitor and analyze key performance indicators to drive continuous improvement.
- Mentor and guide team members to enhance their technical and analytical skills.
- Foster a collaborative and innovative team environment.
- Communicate effectively with stakeholders to provide updates and gather requirements.
- Stay updated with industry trends and best practices to ensure the team remains competitive.

Qualifications:
- Strong experience in business operations with a focus on data analysis and process optimization.
- Proficiency in SQL for data querying and manipulation.
- Expertise in Python for automation and advanced analytics.
- Hands-on experience with Alteryx for data blending and analytics.
- Solid understanding of product analytics and performance measurement.
- Excellent communication and leadership skills.
- Ability to work effectively in a hybrid work model.
- Strong problem-solving skills and attention to detail.
- Ability to mentor and develop team members.
- Knowledge of industry trends and best practices.
- Strong organizational and documentation skills.
- Ability to collaborate with cross-functional teams effectively.
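The SQL-driven product analytics this role describes can be sketched with Python's built-in sqlite3 module; the table, columns, and figures below are invented for illustration only:

```python
# Hypothetical sketch: aggregate monthly transaction volume per card
# product, the kind of SQL reporting named in the responsibilities.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (product TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("credit", "2024-01", 120.0),
     ("credit", "2024-01", 80.0),
     ("debit", "2024-01", 50.0)],
)

# Group-by aggregation: one row per (product, month) with total volume.
rows = conn.execute(
    "SELECT product, month, SUM(amount) FROM txns "
    "GROUP BY product, month ORDER BY product"
).fetchall()
print(rows)  # → [('credit', '2024-01', 200.0), ('debit', '2024-01', 50.0)]
```

In a production setting the same query shape would run against the bank's warehouse, with the results feeding KPI dashboards or Alteryx workflows.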
Posted 3 months ago