1.0 - 3.0 years
9 - 13 Lacs
Kochi, Chennai
Work from Office
Key Responsibilities:
- Apply deep domain knowledge in structural biology to characterize macromolecular complexes and understand their roles in disease pathways.
- Integrate structural insights with multi-omics data (genomics, transcriptomics, proteomics, and metabolomics) to understand molecular mechanisms of disease.
- Utilize and develop computational tools for structural modeling, molecular dynamics simulations, and ligand docking to complement experimental data.
- Stay up to date with the latest advancements in structural biology, biophysics, and computational methods to incorporate cutting-edge research into disease modeling efforts.

Collaboration & Project Management:
- Work closely with cross-functional teams, including disease domain experts, computational scientists, and clinicians, to enhance translational research efforts.
- Work with the Scientific Manager on project planning, execution, and reporting, ensuring alignment with research objectives.
- Communicate findings effectively through reports, presentations, and discussions with internal teams and external collaborators.
- Publish research findings in high-impact, peer-reviewed journals.
- Present work at scientific conferences, symposia, and internal research meetings.
- Contribute to grant applications and funding proposals where relevant.

Qualifications & Experience:
- PhD in structural biology, biophysics, biochemistry, or a closely related field.
- Strong expertise in at least one major structural biology technique (e.g., X-ray crystallography, cryo-electron microscopy (cryo-EM), or NMR spectroscopy).
- Solid understanding of protein structure-function relationships, macromolecular interactions, and their relevance to disease biology.
- Experience in experimental design, data collection, data processing, and structure determination.
- Excellent verbal and written communication skills, with the ability to convey complex structural biology concepts clearly.

Preferred:
- Experience with computational structural biology tools (e.g., Rosetta, AlphaFold, molecular dynamics software).
- Previous experience working in a multidisciplinary research environment.
- Familiarity with AI/ML approaches in structural biology or drug discovery.

Other Considerations:
- Fresh PhD graduates will be hired as Postdoctoral Fellows for a two-year term, with an opportunity for promotion to Scientist based on performance.
- Candidates with at least two years of postdoctoral experience in structural biology, in either academia or industry, will be hired as Scientists with opportunities for career advancement.
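For readers curious about the computational side this posting references (structural modeling and docking alongside experimental data), here is a minimal, illustrative sketch of the kind of scripting such work can involve, using Biopython's PDB module. It is not part of the posting; the file name, chain IDs, and distance cutoff are hypothetical placeholders.

```python
# Minimal sketch: load a macromolecular structure and list chain-A residues
# at the interface with chain B. Assumes Biopython (pip install biopython);
# "complex.pdb", chains "A"/"B", and the 4.0 angstrom cutoff are hypothetical.
from Bio.PDB import PDBParser, NeighborSearch

parser = PDBParser(QUIET=True)
structure = parser.get_structure("complex", "complex.pdb")
model = structure[0]

# Index chain B atoms for fast spatial lookup.
search = NeighborSearch(list(model["B"].get_atoms()))

# Treat a chain-A residue as "interface" if any of its atoms lies within
# 4.0 angstroms of chain B -- a crude but common starting definition.
interface = set()
for atom in model["A"].get_atoms():
    if search.search(atom.coord, 4.0):
        interface.add(atom.get_parent().get_id()[1])

print(sorted(interface))
```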
Posted 1 week ago
1.0 - 5.0 years
20 - 25 Lacs
Pune
Work from Office
Engineer - Mechanical Design - EMH Crane & Components

Requirements:
- Graduate Mechanical Engineer with 1-5 years of experience in design; crane industry preferred, but not mandatory.
- Knowledge of basic mechanical and structural design is a must.
- Familiarity with crane-industry codes and standards such as IS:3177 and IS:807 is desirable.
- Must have hands-on experience with 3D CAD software such as Solid Edge.
- Design validation documentation and activities.
- Basic knowledge of mechanical components and machining processes.
- Working experience with PDM/PLM software such as Teamcenter would be an added advantage.
- Innovative and a self-learner, with good communication skills.

Roles & Responsibilities:
- Work on design and detail engineering of EOT and other cranes as per Indian and international standards.
- Perform design calculations in Excel-based tools and detail engineering using Solid Edge.
- Provide technical support to marketing during tendering, and document the offer design process.
- Bring innovative, new ways of working for efficient and quick offering methods.
- Show aptitude for learning and self-development.
- Improve existing designs for better maintainability, ease of manufacturing, cost reduction, etc.
Posted 1 week ago
5.0 - 10.0 years
8 - 9 Lacs
Thiruvananthapuram
Work from Office
What you'll do:
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Research, create, and develop software applications to extend and improve Equifax solutions.
- Manage sole project priorities, deadlines, and deliverables.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need:
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart:
- Knowledge of or experience with Apache Beam for stream and batch data processing.
- Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Exposure to data visualization tools or platforms.

Primary Location: IND-Trivandrum-Equifax Analytics-PEC
Function: Tech Dev and Client Services
Schedule: Full time
Posted 1 week ago
10.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
You'll lead the Data Science function supporting Lending, overseeing credit risk scoring models across PD, LGD, and collections. You'll guide the team in leveraging alternative data to improve model accuracy and signal. You'll lead the full model lifecycle, driving strategy, standards, and execution across model development, validation, deployment, and monitoring. You'll partner with business, risk, and ops leaders to shape the credit roadmap and influence decisions with data-driven insights. You are experienced in leading teams while being hands-on when needed. This role is suited for professionals with 10+ years of experience in data science and risk analytics. You will report to the Head of Data Science, and this role is onsite, based in Bangalore.

The Critical Tasks You Will Perform:
- Lead the team in building predictive models to separate good vs. bad borrowers using ML and traditional methods (a minimal illustrative sketch follows this section)
- Drive development of ML and deep learning models for loss estimation across portfolios
- Oversee model validation and performance monitoring across diverse data sets
- Guide feature engineering strategies and explore new external data sources
- Champion model governance in collaboration with risk, compliance, and audit teams
- Ensure timely identification of performance drift and lead model refresh cycles
- Communicate modeling outcomes and trade-offs to senior leadership and key stakeholders
- Translate analytics into strategic levers: policy, pricing, targeting, and credit expansion
- Set the vision for solving hard data science problems using best-in-class ML techniques

The Essential Skills You Need:
- Deep domain expertise in credit risk modelling across PD, LGD, EAD, and collections
- Prior experience in FinTech/credit businesses, especially digital lending or unsecured loan portfolios
- Proven track record of applying data science to credit underwriting, risk segmentation, and portfolio management
- Expertise in Python for ML model development, with experience building scalable, production-grade solutions
- Proficiency in Spark, SQL, and large-scale distributed data processing frameworks
- Grasp of advanced ML concepts, including model interpretability, bias mitigation, and performance optimization
- Experience with ML/DL libraries (scikit-learn, XGBoost, TensorFlow/PyTorch) and guiding teams on their best use
- Working knowledge of MLOps and orchestration tools (Airflow, MLflow, etc.), with experience standardising model deployment pipelines
- Exposure to LLMs and Generative AI, with a perspective on their potential applications in credit risk
- Design of robust, reusable feature pipelines from structured and unstructured data sources
- Familiarity with Git, CI/CD, and model versioning frameworks as part of scaling DS delivery
- The mindset and ability to coach team members through complex modelling issues, with experience aligning technical outputs with business strategy

About Grab and Our Workplace:
Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. At Grab, purpose gives us joy and habits build excellence, while we harness the power of technology and AI to deliver our mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.
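As a rough illustration of the first task above (separating good from bad borrowers), here is a minimal scorecard-style sketch using scikit-learn. This is not Grab's actual methodology; the dataset path, feature names, and target column are hypothetical placeholders.

```python
# Minimal sketch of a PD-style model: logistic regression separating
# "good" vs "bad" borrowers, evaluated by AUC. "loans.csv" and the
# feature/target names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("loans.csv")  # hypothetical dataset
features = ["utilization", "tenure_months", "dpd_last_6m", "income"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["default_flag"], test_size=0.3, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# AUC (and Gini = 2*AUC - 1) are the usual discrimination metrics
# for credit scoring models.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"AUC: {auc:.3f}, Gini: {2 * auc - 1:.3f}")
```

In practice a team like this would layer probability calibration, reject inference, and interpretability checks on top of a baseline like this.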
Life at Grab:
We care about your well-being at Grab. Here are some of the global benefits we offer:
- We have your back with Term Life Insurance and comprehensive Medical Insurance.
- With GrabFlex, create a benefits package that suits your needs and aspirations.
- Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave.
- We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.

What we stand for at Grab:
We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
Posted 1 week ago
5.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Join Team Amex and let's lead the way together.

The eCRMS organization is looking for a hands-on Software Engineer for the Customer 360 (C360) Engineering team. The C360 Platform is a critical platform at American Express that provides a holistic view of customers' relationships with various American Express products, manages customer demographics, and provides intelligent insights about customers' contact preferences. This platform is an integral part of all critical user journeys and is at the forefront of the new initiatives the company is undertaking. In C360, we build and operate highly available and scalable services using event-driven reactive architecture to provide real-time services that power critical use cases across the company. We perform data analysis, work on anomaly detection, and create new data insights.

This Engineer role will be an integral part of a team that builds large-scale, cloud-native, event-driven reactive applications to create a 360-degree view of the customer. Specifically, you will help:
- Lead the build of new microservices that help manage our rapidly growing data hub.
- Lead the build of services that perform real-time data processing at scale for relational, analytical queries across multi-dimensional data.
- Lead the build of services that generalize stream processing to make it trivial to source, sink, and stream-process data.
- Improve the efficiency, reliability, and scalability of our data pipelines.
- Work on cross-functional initiatives and collaborate with engineers across organizations.
- Influence team members with creative changes and improvements by challenging the status quo and demonstrating risk-taking.
- Be a productivity multiplier for your team by analyzing your workflow and helping the team become more effective and productive, delivering faster and stronger results.

Are you up for the challenge?
- 5+ years of experience building large-scale distributed applications with object-oriented design using a Java-related stack.
- Master's or bachelor's degree in Computer Science, Information Systems, or another related field (or equivalent work experience).
- Ability to implement scalable, high-performing, secure, highly available solutions.
- Proficient in developing solution architecture for business problems and communicating it to large teams.
- Proficient in weighing the pros and cons of different solution options and gaining alignment on the preferred option with multiple stakeholders.
- Experience with NoSQL technologies such as Cassandra, Couchbase, etc.
- Experience with web services and API development on enterprise platforms using REST, GraphQL, and gRPC.
- Expertise in Big Data technologies like Hive, MapReduce, and Spark.
- Experience with event-driven microservice architecture: Vert.x, Kafka, etc.
- Experience with automated release management using Maven, Git, and Jenkins.
- Experience with Vert.x and event-driven architecture is a plus.
- Experience with Postgres is a plus.
- Experience with Docker/OpenShift-based deployment is a plus.

We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities
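To make the source/sink/stream-process pattern described above concrete, here is a minimal sketch. The posting's stack is Java and Vert.x; Python with kafka-python is used here purely for brevity, and the topic names and broker address are hypothetical.

```python
# Minimal sketch of a source -> transform -> sink stream-processing loop.
# Assumes kafka-python (pip install kafka-python); "customer-events",
# "customer-360-view", and the broker address are hypothetical.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "customer-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Enrich each raw event into a 360-degree-view record before sinking it.
    profile = {"customer_id": event.get("customer_id"),
               "source_topic": message.topic,
               **event}
    producer.send("customer-360-view", profile)
```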
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers.

About the team:
The mission of Roku's Data Engineering team is to develop a world-class big data platform so that internal and external customers can leverage data to grow their businesses. Data Engineering works closely with business partners and engineering teams to collect metrics on existing and new initiatives that are critical to business success. As a Senior Data Engineer working on device metrics, you will design data models and develop scalable data pipelines to capture business metrics across Roku devices.

About the role:
Roku streaming players and Roku TV models are available around the world through direct retail sales and licensing arrangements with TV brands and pay-TV operators. With tens of millions of players sold across many countries, thousands of streaming channels, and billions of hours watched over the platform, building a scalable, highly available, fault-tolerant big data platform is critical to our success. This role is based in Bangalore, India and requires hybrid working, with 3 days in the office.

What you'll be doing:
- Build highly scalable, available, fault-tolerant distributed data processing systems (batch and streaming), handling tens of terabytes of data ingested every day and a petabyte-sized data warehouse
- Build quality data solutions and refine existing diverse datasets into simplified data models that encourage self-service
- Build data pipelines that optimise for data quality and are resilient to poor-quality data sources
- Own the data mapping, business logic, transformations, and data quality
- Perform low-level systems debugging, performance measurement, and optimization on large production clusters
- Participate in architecture discussions, influence the product roadmap, and take ownership and responsibility for new projects
- Maintain and support existing platforms and evolve them to newer technology stacks and architectures

We're excited if you have:
- Extensive SQL skills
- Proficiency in at least one scripting language; Python is required
- Experience in big data technologies like HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, etc.
- Proficiency in data modeling, including designing, implementing, and optimizing conceptual, logical, and physical data models to support scalable and efficient data architectures
- Experience with AWS, GCP, or Looker (a plus)
- Ability to collaborate with cross-functional teams such as developers, analysts, and operations to execute deliverables
- 5+ years of professional experience as a data or software engineer
- BS in Computer Science; MS in Computer Science preferred

The Roku Culture:
We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer.
That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.
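As an illustration of one bullet above, building pipelines that are resilient to poor-quality data sources, here is a minimal PySpark sketch. It is not Roku's actual pipeline; the S3 paths and column names are hypothetical.

```python
# Minimal sketch of a batch pipeline that quarantines bad rows instead of
# failing the job or dropping them silently. Paths/columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("device-metrics").getOrCreate()

raw = spark.read.json("s3://bucket/device-events/2024-01-01/")

is_valid = F.col("device_id").isNotNull() & (F.col("watch_seconds") >= 0)
valid, invalid = raw.filter(is_valid), raw.filter(~is_valid)

# Keep bad records for later inspection.
invalid.write.mode("append").parquet("s3://bucket/quarantine/device-events/")

# Aggregate the clean records into a simple daily model.
(valid.groupBy("device_id", F.to_date("event_ts").alias("day"))
      .agg(F.sum("watch_seconds").alias("total_watch_seconds"))
      .write.mode("overwrite")
      .partitionBy("day")
      .parquet("s3://bucket/warehouse/device_daily_watch/"))
```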
Posted 1 week ago
13.0 - 18.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Job Role Value Proposition:
The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative, and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most: our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate.

As part of the Tech Talent Transformation (T3) agenda, MetLife is establishing a Technology Center in India. This technology center will operate as an integrated organization across onshore, offshore, and strategic vendor partners in an Agile delivery model.

The US Actuarial Technology Delivery Lead will be part of the larger Actuarial, Reinsurance, and Treasury (ART) leadership team. As a key technical leader, you will design, develop, and maintain scalable data pipelines, implement efficient data solutions, and drive data-driven strategies for our organization. Your expertise in Azure, Databricks, Scala, and big data technologies will be critical in optimizing data flows and empowering analytics initiatives. You will collaborate closely with actuarial teams to deliver data-driven insights and solutions, supporting risk modeling, pricing strategies, and other key business initiatives.

Key Relationships:
Internal stakeholders - Corporate Technology ART Leader, ART leadership team, India Corporate Technology AVP, and business process owners for US Actuarial.

Key Responsibilities:
- Lead the design, development, and maintenance of scalable data architectures and pipelines in Azure cloud environments.
- Develop strategies for the integration of actuarial data sources, both structured and unstructured, to enable advanced analytics.
- Build and optimize data models and ETL/ELT processes using Databricks, Scala, and Spark.
- Ensure the integrity, security, and quality of data through robust data governance practices.
- Implement performance tuning strategies to optimize big data processing workflows.
- Develop automation frameworks and CI/CD pipelines for data pipeline deployment.
- Lead and mentor junior data engineers, providing technical guidance and support.
- Stay up to date with emerging technologies in the data ecosystem and drive innovation.
- People management - Manage data engineers located in multiple countries, driving upskilling/reskilling, the hiring and retention agenda, and adoption of an engineering culture.
- Stakeholder management - Manage key business stakeholders to deliver the required technology capabilities supporting the digital transformation agenda, and drive prioritization of the product backlog. This includes managing key vendors providing resources, SaaS, and other capabilities.
- Ways of working - Adopt Agile ways of working in the software delivery lifecycle.
- E2E software lifecycle management (architecture, design, development, testing, and production) - drive the decisions and implementation of technology best practices, architecture, and technology stack, including adoption of cloud, AI/data, and other relevant emerging technology that is fit for purpose.
- Conduct peer reviews of solution designs and related code, ensuring adherence to best-practice guidelines and compliance with security and privacy mandates.
- Investigate and resolve escalated production management incidents, problems, and service requests.
- Ensure disaster recovery, privacy, and security are aligned to enable application/platform stability, including technology currency management.
- Partner with Cyber & IT Security, Data Analytics, Infrastructure, Global Risk Management, Compliance, and Internal Audit to provide holistic technology services, including risks, controls, and security, to the business.

Education:
A Bachelor's/Master's degree in computer science or an equivalent field.

Experience:
- 13+ years of experience leading software engineering teams
- Proven experience (5+ years) in data engineering, big data processing, and distributed systems
- Proficiency in Azure services (Data Factory, Azure Synapse, Azure Data Lake, Event Hubs)
- Strong hands-on experience with Databricks and Spark for big data processing
- Expertise in Scala programming for data processing workflows
- Deep understanding of ETL/ELT processes, data modeling, and data warehousing principles
- Familiarity with cloud-native architectures and CI/CD pipelines
- Experience with data governance, security, and compliance practices
- Excellent problem-solving and communication skills
- Strong leadership and mentoring abilities
- Proficiency in the software development lifecycle, including CI/CD, test-driven development, domain-driven design, and Agile ways of working
- Proven track record of partnering with the business to deliver mission-critical transformation via an Agile approach

Preferred Skills:
- Knowledge of additional programming languages like Python and SQL
- Familiarity with actuarial tools such as Prophet or other risk modeling systems
- Experience with DevOps tools and practices (Azure DevOps, Terraform)
- Understanding of machine learning workflows and MLOps practices
- Proficiency in data visualization tools (Power BI)

Soft Skills:
- Strong leadership and project management capabilities
- Excellent problem-solving, communication, and stakeholder management skills
- Ability to balance technical innovation with business value delivery

Competencies:
- Communication: Ability to influence, help communicate the organization's direction, and ensure results are achieved
- Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment
- People management: Inspiring, motivating, and leading diverse and distributed teams
- Diverse environment: Can-do attitude and ability to work in a high-paced environment

Technical Stack:
- Programming: Python, Scala, and SQL for data transformation and analytics workflows
- Azure services: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Storage, and Event Hubs
- Databricks platform: Strong knowledge of cluster management, performance tuning, job scheduling, and advanced analytics with Spark
- Big data tools: Apache Spark, Delta Lake, and distributed computing concepts
- Data security: Security best practices including RBAC, encryption, and GDPR compliance strategies
- CI/CD: DevSecOps for DataOps using GitHub Actions, Azure DevOps, or similar tools for automation
- Data governance: Knowledge of data quality frameworks, lineage tracking, and metadata management
- Cloud infrastructure: Azure networking, IAM, and infrastructure monitoring

Certifications:
- Azure Data Engineer / Azure AI Engineer
- Azure Architect
- Databricks - Azure Platform Architect
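For a sense of the day-to-day engineering this role describes, here is a minimal PySpark/Delta Lake sketch of a cleansing step feeding a curated layer (the role is Scala-first, but Python is also listed). The table paths and column names are hypothetical, and a Databricks runtime with a ready `spark` session is assumed.

```python
# Minimal sketch of a Databricks-style ETL step: read raw Delta data,
# apply basic cleansing, write to a curated table. Assumes a Databricks
# runtime (Delta Lake and a `spark` session available); names hypothetical.
from pyspark.sql import functions as F

policies = spark.read.format("delta").load("/mnt/raw/actuarial/policies")

curated = (policies
           .dropDuplicates(["policy_id"])
           .withColumn("effective_date", F.to_date("effective_date"))
           .filter(F.col("sum_assured") > 0))

(curated.write.format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")
        .saveAsTable("curated.policies"))
```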
Posted 1 week ago
2.0 - 6.0 years
25 - 30 Lacs
Bengaluru
Work from Office
We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.

The Team:
Our team is responsible for building the Celonis end-to-end Task Mining solution. Task Mining is the technology that allows businesses to capture user interaction (desktop) data, so they can analyze how people get work done, and how they can do it even better. We own all the related components, e.g. the desktop client, the related backend services, the data processing capabilities, and the Studio frontend applications.

The Role:
Celonis is looking for a Senior Software Engineer to build new features and increase the reliability of our Task Mining solution. You will contribute to the development of our Task Mining Client, so expertise in C# and the .NET Framework is required; knowledge of Java and Spring Boot is a plus.

The work you'll do:
- Implement highly performant and scalable desktop components to improve our existing Task Mining software
- Own the implementation of end-to-end solutions: leading the design, implementation, build, and delivery to customers
- Increase the maintainability, reliability, and robustness of our software
- Continuously improve and automate our development processes
- Document procedures and concepts, and share knowledge within and across teams
- Manage complex requests from support, finding the right technical solution and managing communication with stakeholders
- Occasionally work directly with customers, including getting to know their system in detail and helping them debug and improve their setup

The qualifications you need:
- 2-6 years of professional experience building .NET applications
- Passion for writing clean code that follows SOLID principles
- Hands-on experience in C# and the .NET Framework
- Experience in user interface development using WPF and MVVM
- Familiarity with Java and the Spring framework is a plus
- Familiarity with containerization technologies (e.g., Docker)
- Experience with REST APIs and/or distributed microservice architectures
- Experience with monitoring and log analysis tools (e.g., Datadog)
- Experience writing and setting up unit and integration tests
- Experience refactoring legacy components
- Ability to supervise and coach junior colleagues
- Experience interacting with customers is a plus
- Strong communication skills

What Celonis Can Offer You:
- Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business.
- Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities.
- Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more.
- Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs.
- Connect and Belong: Find community and support through dedicated inclusion and belonging programs.
- Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future.
- Collaborate Globally: Join a dynamic, international team of talented individuals.
- Empowered Environment: Contribute your ideas in an open culture with autonomous teams.
About Us:
Celonis makes processes work for people, companies, and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operations. It's system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses. Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide.

Celonis Inclusion Statement:
At Celonis, we believe our people make us who we are and that "The Best Team Wins". We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up, and knows their voice is heard - that's when creativity and innovation happen.

Your Privacy:
Any information you submit to Celonis as part of your application will be processed in accordance with the Celonis Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process. Please be aware of common job offer scams, impersonators, and frauds.
Posted 1 week ago
2.0 - 4.0 years
2 - 2 Lacs
Jaipur, Bhankrota
Work from Office
1. Enter and update data in Excel and internal systems.
2. Verify accuracy and resolve discrepancies.
3. Maintain organized data records.
4. Coordinate with WH teams for data collection and validation.
Posted 1 week ago
0.0 years
1 - 1 Lacs
Viluppuram
Work from Office
Greetings from AnnexMed!!!

Huge openings for Data Analyst - Non-Voice Process (Freshers) - Villupuram

Desired Skills:
* Typing skill (upper / lower)
* Qualification: Diploma or any degree
* Passed-out year: 2022 to 2025
* Good communication skills
* Location: Candidates must reside within a 15 km radius of the office location

Interview Time: 11:00 AM to 4:00 PM
Interview Days: Monday to Friday
Working days: 5 days only; Saturday & Sunday fixed leave
Shift: Night shift only (9:30 PM to 5:30 AM)
Contact: Geetha HR - 8220529346
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Navi Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing services.
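A minimal sketch of the PySpark skills this posting lists: the core DataFrame API for distributed processing, with a small aggregate pulled into Pandas for downstream use. The HDFS path and column names are hypothetical.

```python
# Minimal sketch: distributed aggregation with PySpark, then a small
# result set collected as Pandas. "hdfs:///data/orders" and the column
# names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

orders = spark.read.parquet("hdfs:///data/orders")

daily = (orders
         .filter(F.col("status") == "COMPLETE")
         .groupBy(F.to_date("order_ts").alias("day"))
         .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue")))

# Small aggregates can safely be pulled to the driver for reporting.
report = daily.toPandas()
print(report.head())
```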
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing services.
Posted 1 week ago
0.0 - 2.0 years
1 - 4 Lacs
Gurugram, Raipur, Mumbai (All Areas)
Work from Office
Roles and Responsibilities:
- This is a voice and non-voice process.
- Deal with customers and meet customer requirements.
- Candidate should be ready to work.

Desired Candidate Profile:
- Industry: BPO / Call Centre / ITES
- Functional Area: ITES, BPO, KPO, LPO, Customer Service, Operations
- Role Category: Voice
- Role: Associate / Senior Associate (Non-Technical)
- Education qualification: Graduate in any discipline / undergraduates / dropouts
- Age limit: 18 to 32; 12th pass or any degree or diploma can apply

Perks and Benefits:
- Domestic and international call center
- Salary 15,000 to 35,000 and incentive up to 1 lakh
- No fees

Contact: WhatsApp 9781021114; call 9988350971, 01725000971, 7508062612, 9988353971
Posted 1 week ago
8.0 - 13.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore, KA, IN
Company: Alstom

At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.

Could you be the full-time **Data Solutions Manager** in **[Insert Location]** we're looking for?

Take on a new challenge and apply your **data science and technical leadership** expertise in a cutting-edge field. You'll work alongside **collaborative and innovative** teammates. You'll play a pivotal role in shaping and sustaining advanced data solutions that drive our industrial programs. Day-to-day, you'll work closely with teams across the business (**engineering, IT, and program management**), **define and develop scalable data solutions**, and much more. You'll specifically take care of **designing production-grade, cyber-secure data solutions**, but also **applying AI techniques to enhance data utilization for key indicators**.

We'll look to you for:
- Managing the team to ensure technical excellence and process adherence
- Designing scalable, multi-tenant data collectors and storage systems
- Building streaming and batch data processing pipelines
- Developing SQL and NoSQL data models
- Assessing and enhancing the quality of incoming data flows
- Applying advanced AI techniques and data management/security components
- Creating customizable analytical dashboards
- Evaluating opportunities presented by emerging technologies
- Implementing strong testing and quality assurance practices

All about you:
We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Engineering degree or equivalent
- 8+ years of experience in IT, digital companies, software, or startups
- Proficiency in data processing and software development using tools like QlikSense, PowerApps, Power BI, or Java/Scala
- Experience with Apache Spark and other data processing frameworks
- Strong statistical skills (e.g., probability theory, regression, hypothesis testing)
- Expertise in machine learning techniques and algorithms (e.g., Logistic Regression, Decision Trees, Clustering)
- Proficiency in data science methods (CRISP-DM, feature engineering, model evaluation)
- Experience with Python and R libraries (NumPy, Pandas, scikit-learn)
- Deep knowledge of SQL database configuration (e.g., Postgres, MariaDB, MySQL)
- Familiarity with DevOps tools (e.g., Docker, Ansible) and version control (e.g., Git)
- Knowledge of cloud platforms (Azure, AWS, GCP) is a plus
- Understanding of network and security principles (e.g., SSL, certificates, IPSEC)
- Fluent in English; French is a plus

Things you'll enjoy:
Join us on a life-long transformative journey - the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career.
You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines
- Work with cutting-edge security standards for rail data solutions
- Collaborate with transverse teams and supportive colleagues
- Contribute to innovative and impactful projects
- Utilise our **flexible and inclusive** working environment
- Steer your career in whatever direction you choose, across functions and countries
- Benefit from our investment in your development through award-winning learning opportunities
- Progress towards leadership roles in data science and digital transformation
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension)

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
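As one concrete example of the ML techniques named above (clustering, with model evaluation as in CRISP-DM), here is a minimal scikit-learn sketch on synthetic stand-in data; nothing here reflects Alstom's actual systems.

```python
# Minimal sketch: k-means clustering on stand-in telemetry features, with
# silhouette score as a simple evaluation step. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # stand-in for telemetry features

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

# Silhouette score is one standard way to judge cluster quality during
# the evaluation phase of CRISP-DM.
print(f"silhouette: {silhouette_score(X_scaled, labels):.3f}")
```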
Posted 1 week ago
4.0 - 8.0 years
25 - 30 Lacs
Pune
Hybrid
So, what's the role all about?
As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems, as well as working with cross-functional teams to ensure efficient data processing and integration. You will leverage your knowledge of Apache Spark to create robust ETL processes, optimize data workflows, and manage high volumes of structured and unstructured data.

How will you make an impact?
- Design, implement, and maintain data pipelines using Apache Spark for processing large datasets.
- Work with data engineering teams to optimize data workflows for performance and scalability.
- Integrate data from various sources, ensuring clean, reliable, and high-quality data for analysis.
- Develop and maintain data models, databases, and data lakes.
- Build and manage scalable ETL solutions to support business intelligence and data science initiatives.
- Monitor and troubleshoot data processing jobs, ensuring they run efficiently and effectively.
- Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
- Implement data security best practices to protect sensitive information.
- Maintain a high level of data quality and ensure timely delivery of data to end users.
- Continuously evaluate new technologies and frameworks to improve data engineering processes.

Have you got what it takes?
- 8-11 years of experience as a Data Engineer, with a strong focus on Apache Spark and big data technologies.
- Expertise in Spark SQL, DataFrames, and RDDs for data processing and analysis.
- Proficiency in programming languages such as Python, Scala, or Java for data engineering tasks.
- Hands-on experience with cloud platforms like AWS, specifically with data processing and storage services (e.g., S3, BigQuery, Redshift, Databricks).
- Experience with ETL frameworks and tools such as Apache Kafka, Airflow, or NiFi (a minimal Airflow sketch follows this listing).
- Strong knowledge of data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
- Solid understanding of distributed computing, data modeling, and data architecture principles.
- Strong problem-solving skills and the ability to work with large and complex datasets.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.

What's in it for you?
Join an ever-growing, market-disrupting global company where the teams - comprised of the best of the best - work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week.
Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7235
Reporting into: Tech Manager
Role Type: Individual Contributor
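Here is the minimal Airflow sketch referenced above, since the posting lists Airflow among the expected ETL tools. The DAG id, schedule, and callables are hypothetical placeholders, and Airflow 2.4+ is assumed for the `schedule` argument.

```python
# Minimal sketch of a two-step extract -> load DAG. Names are hypothetical;
# assumes Airflow 2.4+ (earlier versions use schedule_interval).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")  # placeholder

def load():
    print("write data to warehouse")  # placeholder

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```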
Posted 1 week ago
5.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
- Design, develop, and maintain scalable data processing applications using Spark and the PySpark API.
- 5+ years of experience in at least one of the following: Java, Spark, Scala, Python; API development expertise.
- Write efficient, reusable, and well-documented code.
- Design and implement data pipelines using tools like Spark and PySpark.
- Strong analytical and problem-solving abilities to address technical challenges.
- Perform code reviews and provide constructive feedback to improve code quality.
- Design and implement data processing tasks that integrate with SQL databases (see the sketch after this listing).
- Proficiency in data modeling and in data lake, lakehouse, and data warehousing concepts.
- Experience with cloud platforms like AWS.
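A minimal sketch of the SQL-database integration mentioned above: Spark reading from and writing back to a relational database over JDBC. The connection details are hypothetical, and the matching JDBC driver must be on the Spark classpath.

```python
# Minimal sketch: Spark <-> SQL database integration over JDBC.
# URL, credentials, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-integration").getOrCreate()

customers = (spark.read.format("jdbc")
             .option("url", "jdbc:postgresql://db-host:5432/sales")
             .option("dbtable", "public.customers")
             .option("user", "etl_user")
             .option("password", "***")
             .load())

# Compute an aggregate and push it back to the warehouse.
summary = customers.groupBy("region").count()
(summary.write.format("jdbc")
        .option("url", "jdbc:postgresql://db-host:5432/sales")
        .option("dbtable", "public.customer_counts")
        .option("user", "etl_user")
        .option("password", "***")
        .mode("overwrite")
        .save())
```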
Posted 1 week ago
2.0 - 6.0 years
0 - 2 Lacs
Jamnagar
Work from Office
Looking for female candidates with experience in Microsoft Excel, online research, and data management. Candidates should be capable of delivering results with minimal supervision.
Posted 1 week ago
5.0 - 10.0 years
8 - 10 Lacs
Chennai, Tamil Nadu, India
On-site
Skills: Google Cloud Platform (GCS, Dataproc, BigQuery, Dataflow); programming language: Java; scripting languages such as Python, shell script, and SQL.

- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)

Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java
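For context, here is a minimal sketch of querying BigQuery from Python, one of the GCP services listed. The project, dataset, and table names are hypothetical, and the google-cloud-bigquery client library plus application-default credentials are assumed.

```python
# Minimal sketch: run a BigQuery query from Python and iterate the rows.
# Assumes google-cloud-bigquery is installed and credentials are set up;
# project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT status, COUNT(*) AS n
    FROM `my-project.sales.orders`
    GROUP BY status
"""
for row in client.query(query).result():
    print(row.status, row.n)
```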
Posted 1 week ago
0.0 years
1 - 3 Lacs
Ahmedabad
Work from Office
Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Mega Virtual Drive for Customer Service roles (English + Hindi) on 18th June 2025 (Wednesday) || Ahmedabad location
Date: 18-June-2025 (Wednesday)
MS Teams meeting ID: 462 682 166 152 4
MS Teams passcode: ko7zJ3hp
Time: 12:00 PM - 1:00 PM
Job location: Ahmedabad (work from office)
Languages known: Hindi + English
Shifts: Flexible with any shift

Responsibilities:
• Respond to customer queries and concerns
• Provide support for data collection to enable recovery of the account for the end user
• Maintain a deep understanding of client processes and policies
• Reproduce customer issues and escalate product bugs
• Provide excellent customer service to our customers
• Exhibit capacity for critical thinking and analysis
• Showcase a proven work ethic, with the ability to work well both independently and within the context of a larger collaborative environment

Minimum qualifications:
• Graduate (any discipline except law)
• Only freshers are eligible
• Fluency in English and Hindi is mandatory

Preferred qualifications:
• Effective probing, analyzing, and understanding skills
• Analytical skills with a customer-centric approach
• Excellent proficiency in written English, with a neutral English accent
• Ability to work on a flexible schedule (including weekend shifts)

Why join Genpact?
• Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation
• Make an impact - drive change for global enterprises and solve business challenges that matter
• Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training. **Note: Please keep your E-Aadhaar card handy while appearing for the interview.
Posted 1 week ago
8.0 - 13.0 years
30 - 35 Lacs
Gurugram
Work from Office
We are looking for a highly skilled and experienced Senior Engineer with a history of building Big Data, GCP Cloud, Python, and Spark applications. The Senior Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Joining the Enterprise Marketing team, this role will be focused on delivering innovative solutions to satisfy the needs of our business. As an agile team, we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. We pride ourselves on a culture of kindness and positivity, and a continuous focus on supporting colleague development to help you achieve your career goals. We lead with integrity, and we emphasize work/life balance for all of our teammates.

How will you make an impact in this role?
There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:
- Develop innovative, high-quality, and robust operational engineering capabilities.
- Develop software in our technology stack, which is constantly evolving but currently includes Big Data, Spark, Python, Scala, GCP, and the Adobe suite (e.g., Customer Journey Analytics).
- Work with business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps.
- Create technical solution designs to meet business requirements, and define best practices to be followed by the team.
- Take your place as a core member of an agile team driving the latest development practices.
- Identify and drive reengineering opportunities, and opportunities for adopting new technologies and methods.
- Suggest and recommend solution architecture to resolve business problems.
- Perform peer code reviews and participate in technical discussions with the team on the best possible solutions.

Minimum Qualifications:
- BS or MS degree in computer science, computer engineering, or another technical discipline, or equivalent work experience.
- 8+ years of hands-on software development experience with Big Data analytics solutions: Hadoop Hive, Spark, Scala, Python, shell scripting, and GCP Cloud (BigQuery, Bigtable, Airflow).
- Working knowledge of the Adobe suite, such as Adobe Experience Platform and Adobe Customer Journey Analytics.
- Proficiency in SQL and database systems, with experience designing and optimizing data models for performance and scalability.
- Design and development experience with Kafka, real-time ETL pipelines, and APIs is desirable.
- Experience designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies (see the sketch after this listing).
- Certification in a cloud platform (GCP Professional Data Engineer) is a plus.
- Understanding of distributed (multi-tiered) systems, data structures, algorithms, and design patterns.
- Strong object-oriented programming skills and design patterns.
- Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git, Maven).
- Good knowledge of and experience with configuration management tools like GitHub.
- Ability to analyze complex data engineering problems, propose effective solutions, and implement them.
- Looks proactively beyond the obvious for continuous improvement opportunities.
- Communicates effectively with product and cross-functional teams.
- Willingness to learn new technologies and leverage them to their optimal potential.
- Understanding of various SDLC methodologies and familiarity with Agile scrum ceremonies.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities
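Here is the minimal sketch referenced above: a GCP-flavoured Spark pipeline that reads raw data from GCS and writes an aggregate to BigQuery via the spark-bigquery connector. This is illustrative only; the bucket, dataset, and column names are hypothetical, and the connector is assumed to be available on the cluster.

```python
# Minimal sketch: GCS -> Spark -> BigQuery. Assumes the spark-bigquery
# connector is on the cluster (e.g., Dataproc); names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("marketing-etl").getOrCreate()

events = spark.read.parquet("gs://marketing-raw/events/")

daily = (events
         .groupBy(F.to_date("event_ts").alias("day"), "campaign_id")
         .agg(F.countDistinct("customer_id").alias("reach")))

(daily.write.format("bigquery")
      .option("table", "marketing.daily_campaign_reach")
      .option("temporaryGcsBucket", "marketing-tmp")  # staging for the load
      .mode("overwrite")
      .save())
```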
Posted 1 week ago
0.0 - 3.0 years
1 - 4 Lacs
Ahmedabad
Work from Office
We are looking for a passionate and motivated individual to join our team as a Data Engineering Intern / Junior Data Engineer. This role is ideal for someone who has a foundational understanding of data concepts and is eager to learn and grow in the data engineering domain.

Key Responsibilities:
- Assist in the development and maintenance of data pipelines
- Work on data extraction, transformation, and loading (ETL) tasks
- Support the design and development of data warehouses and/or data lakes
- Collaborate with senior engineers and analysts to understand data requirements
- Write basic SQL queries and Python scripts for data processing
- Participate in data validation and quality checks
- Document data processes and workflows

Basic Requirements:
- Basic understanding of SQL and Python
- Familiarity with ETL concepts and data warehouse/lake terminology
- Interest in cloud platforms like Azure, AWS, or Google Cloud (GCP)
- Strong analytical and problem-solving skills
- Eagerness to learn and take initiative in a fast-paced environment
- Excellent communication and presentation skills

Good to Have (Bonus):
- Exposure to tools like Azure Data Factory, Databricks, or Power BI
- Academic or self-initiated projects related to data engineering
- Knowledge of version control systems like Git

Why Join Us?
- Hands-on learning with modern data tools and technologies
- Work in a supportive and collaborative environment
- Gain real-world experience in data engineering
- Opportunity for full-time employment based on performance
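For a flavour of the entry-level ETL and validation tasks listed above, here is a minimal sketch using Pandas and SQLite, so it runs with no infrastructure; the file and column names are hypothetical.

```python
# Minimal sketch of extract -> validate -> load with Pandas and SQLite.
# "sales_raw.csv" and its columns are hypothetical placeholders.
import sqlite3
import pandas as pd

# Extract
df = pd.read_csv("sales_raw.csv")

# Transform: basic cleaning plus a simple quality check.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
bad_rows = df[df["order_date"].isna() | (df["amount"] <= 0)]
clean = df.drop(bad_rows.index)
print(f"quarantined {len(bad_rows)} bad rows out of {len(df)}")

# Load
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("sales", conn, if_exists="replace", index=False)
```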
Posted 1 week ago
18.0 - 20.0 years
25 - 30 Lacs
Bengaluru
Work from Office
This job will drive the strategic vision and development of cutting-edge machine learning models and algorithms to solve complex problems. You will work closely with data scientists, software engineers, and product teams to enhance services through innovative AI/ML solutions. Your role will involve building scalable ML pipelines, ensuring data quality, and deploying models into production environments to drive business insights and improve customer experiences.
Essential Responsibilities:
Define and drive the strategic vision for machine learning initiatives at the organizational level.
Lead the development and optimization of state-of-the-art machine learning models.
Oversee the preprocessing and analysis of large datasets.
Deploy and maintain ML solutions in production environments (see the illustrative sketch after this listing).
Collaborate with cross-functional teams to integrate ML models into products and services.
Monitor and evaluate the performance of deployed models, making necessary adjustments.
Mentor and guide junior engineers and data scientists.
Publish research findings and contribute to industry discussions.
Represent the company at conferences and external engagements.
Influence the direction of the company's AI/ML strategy and contribute to long-term planning.
Minimum Qualifications:
Minimum of 18 years of relevant work experience and a Bachelor's degree or equivalent experience.
Deep expertise with ML frameworks like TensorFlow, PyTorch, or scikit-learn.
Extensive experience with cloud platforms (AWS, Azure, GCP) and tools for data processing and model deployment.
Recognized as an industry expert with a strong publication record.
Subsidiary: PayPal
Travel Percent: 20
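As a small, purely illustrative sketch of the train-evaluate loop that sits at the core of the responsibilities above, here is a minimal scikit-learn example. The data is synthetic; a real pipeline at this scale would add feature stores, versioned datasets, and production monitoring.

```python
# Illustrative only: a minimal training-and-evaluation loop of the kind
# this role oversees at much larger scale. The dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real, preprocessed dataset.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Monitoring deployed models starts with holdout metrics like this one.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```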
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
REPORTING TO: Director - Market Research
EXPERIENCE: 2 - 5 years of managerial experience in MR
EDUCATION: Any graduate, with a PG in management
BROAD RESPONSIBILITIES:
The key face for the customer, managing clients, issues, and engagement through the course of projects. Provides clients with solutions related to projects, with a focus on nurturing client relationships.
The Project Manager is responsible for the management and implementation of the internal business processes involved in the collection and preparation of market research data, from the provision of the questionnaire to the delivery of the final deliverables. The Project Manager is responsible for a broad range of complex market research, administrative, and technical activities. The position requires management of and facilitation among the various internal departments and external agencies involved in the project.
With minimal consultation from a senior team member, manages all aspects of the internal business process:
All administrative tasks
Project initiation
Project scheduling
Managing all field activities to ensure that projects are completed within established budgets, parameters, and schedules
Monitoring field status reports
Understanding the data requirements
Checking data and proofreading deliverables
Coordinating with internal and external departments/agencies
Project management, client management, client servicing, and excellent communication skills are essential.
Qualifications
Experience in MR data collection and consulting.
Study setup.
Scheduling projects: planning what happens when, negotiating where necessary; preparing detailed instructions on studies to ensure studies are correctly administered; and personal briefings to Survey Programming, Data Collection, and Data Processing when appropriate.
Questionnaire input and specs: liaising with the client, with suggestions/recommendations for effective data collection.
Project handling: monitoring job progress and providing feedback; ensuring deadlines are met and taking corrective action where necessary; liaising with client service and data processing on code frames and DP specs; handling project-related queries from data collection or data processing post-field; ensuring work meets quality standards.
Quality control: Project Managers keep in touch with data collection and data processing or suppliers during the project so that problems can be rectified early in the job cycle. Adhere to all processes/standards to ensure quality.
Job analysis: analyzing performance on the study and providing recommendations to client service and operations for the future.
Additional Information
Ability to handle multiple tasks and meet assigned deadlines within extremely short timeframes
Strong attention to detail and accuracy
Excellent analytical, computational, and problem-solving skills
Excellent interpersonal and negotiation skills
Strong written and verbal communication skills, with the ability to effectively interact with internal and external clients
Ability to explore solutions to complex project situations within tight deadlines
Strong computer skills, including the Microsoft Office suite and other specialized business-related software systems
Enthusiastic, with good people skills: enjoys working with and talking to people, and is open to and interested in new ideas and ways of doing things
Should be willing to work in night shifts (6:00 PM to 3:00 AM)
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
RELEVANT EXPERIENCE: Minimum 2 years
INDUSTRY EXPERIENCE: Market Research
The Analyst role focuses on providing solutions to clients utilizing a variety of data processing tools and advanced dashboard software. A successful candidate has a strong understanding of primary and secondary data deliverables, especially reporting; is collaborative, self-motivated, and client-focused; and has an ability to master new tools and processes quickly.
Responsibilities of the Role
Thorough review of client materials and preparation of queries in a timely manner.
Application of internal procedures and methodologies to create client-facing dashboards.
Ability to detect illogical results and capacity to communicate them in a professional manner, with a focus on solutions (see the illustrative sketch after this listing).
Deliver assigned tasks in a timely and high-quality manner, resulting in high client satisfaction. (Deliverables include dashboards, custom data, and advanced analytics.)
Expected to spend time reporting and receiving information to and from their manager.
Commitment to learning the latest technologies in the data collection field.
Requirements
Essential:
Bachelor's or Master's degree in Computer Science (B.E./B.Tech/MSc/MCA)
Exceptional attention to detail, and commitment to the accuracy and integrity of information
Exceptional ability to multitask and balance multiple projects and priorities
Strong problem-solving skills, including an ability to think outside the box
Good understanding of databases
Basic statistics knowledge
Full professional proficiency in English, both written and spoken
Desirable:
MS Office experience (Excel, Word, PPT)
Coding experience in a programming language
Market research experience (preferably with healthcare surveys)
Experience creating reports
Personal Qualities:
Organized, self-motivated, and self-directed
Ingenuity, with a willingness and enthusiasm for learning new techniques
Excellent interpersonal communication skills
Excellent at establishing and managing relationships with clients, vendors, and coworkers
Logical thinking skills
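As a small, purely illustrative sketch of the "detect illogical results" responsibility above, here is a pandas check that flags out-of-range survey responses before they reach a client dashboard. The column names and scale bounds are hypothetical assumptions, not this employer's schema.

```python
# Illustrative only: flagging "illogical" survey responses with pandas.
# Column names and the 1-5 satisfaction scale are hypothetical.
import pandas as pd

responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "age": [34, 29, 130, 41],        # 130 is implausible
    "satisfaction": [5, 3, 4, 9],    # scale assumed to be 1-5
})

# Flag rows that violate basic logic rules.
issues = responses[
    (responses["age"] > 110) | (~responses["satisfaction"].between(1, 5))
]

if not issues.empty:
    print("Illogical responses found:")
    print(issues)
```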
Posted 1 week ago
4.0 - 5.0 years
6 - 7 Lacs
Bengaluru
Work from Office
The incumbent will be responsible for timely project delivery, with high-quality interaction with the client over telephone/email, and will drive and achieve internal targets.
Requirements
Minimum hands-on experience working in Market Research using Quantum.
Graduate/postgraduate with relevant experience and excellent communication skills.
Good knowledge of MS Office and data processing tools like Quantum, SPSS, and Quanvert.
Attention to detail; should have good analytical skills.
Should be able to handle multiple projects simultaneously and prioritize work according to pre-set timelines.
Ability to work on tabulations, data edits, data validation, and other data processing activities.
Reporting Structure
Will be reporting to the Team Leader.
Communication should be good, and the candidate should be willing to work in shifts.
As part of job responsibilities, you are required to comply with the ISO 20252:2019 and ISO 27001 standards.
Knowledge, Skill, Ability:
Experience with SPSS
Ability to learn quickly
Ability to communicate effectively
Willing to work in the US EST shift (6:00 PM - 3:00 AM)
Qualifications
Computer Science graduate
Posted 1 week ago