4.0 - 8.0 years
12 - 23 Lacs
Delhi, India
On-site
Description
We are looking for an experienced NLP Engineer to join our dynamic team in India. The ideal candidate will have a strong background in natural language processing and machine learning, with a passion for building innovative solutions that leverage language data.

Responsibilities
- Design and implement NLP models and algorithms for various applications.
- Collaborate with data scientists and software engineers to integrate NLP features into products.
- Conduct research on state-of-the-art NLP methods and apply them to solve real-world problems.
- Analyze large datasets to extract meaningful insights and improve model performance.
- Optimize and fine-tune models for better accuracy and efficiency.

Skills and Qualifications
- 4-8 years of experience in Natural Language Processing or related fields.
- Strong programming skills in Python and familiarity with NLP libraries such as NLTK, spaCy, or Hugging Face Transformers.
- Experience with machine learning frameworks such as TensorFlow or PyTorch.
- Solid understanding of statistical analysis and data mining techniques.
- Knowledge of deep learning architectures for NLP, including RNNs, LSTMs, and Transformers.
- Experience in text preprocessing, feature extraction, and model evaluation metrics.
- Familiarity with cloud services (AWS, GCP, Azure) for deploying NLP solutions.
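The model-evaluation-metrics skill mentioned above can be illustrated with a short, library-free sketch. This is a minimal micro-averaged precision/recall/F1 computation over token labels; the label sequences and the "O" (outside) convention are hypothetical, not part of the posting.

```python
# Minimal sketch of NLP model evaluation (micro precision/recall/F1)
# over token-level labels; "O" marks tokens outside any entity.

def precision_recall_f1(gold, predicted):
    """Micro-averaged precision, recall and F1 for two label sequences."""
    tp = sum(1 for g, p in zip(gold, predicted) if p != "O" and p == g)
    fp = sum(1 for g, p in zip(gold, predicted) if p != "O" and p != g)
    fn = sum(1 for g, p in zip(gold, predicted) if g != "O" and p != g)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical gold vs. predicted entity labels for six tokens.
gold = ["O", "PER", "PER", "O", "LOC", "O"]
pred = ["O", "PER", "O",   "O", "LOC", "LOC"]
p, r, f1 = precision_recall_f1(gold, pred)
```

In practice libraries such as scikit-learn or seqeval provide these metrics, but the arithmetic above is what they reduce to.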
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
We are looking for a highly motivated and experienced Senior Software Engineer to join our team and contribute significantly to the development and enhancement of our cutting-edge options analytics platform. Your primary responsibilities will involve designing, developing, and implementing robust and scalable Java-based solutions for calculating and analyzing options pricing models and risk metrics. The ideal candidate will have a solid understanding of financial markets, options theory, and a proven track record of building high-performance, data-driven applications in Java.

You will be involved in:
- Designing, developing, and maintaining Java-based components for our options analytics platform, including pricing models, risk calculations (Greeks, VaR, etc.), and data processing pipelines.
- Implementing and optimizing complex algorithms for option pricing and risk analysis to ensure accuracy and performance.
- Collaborating with product managers and stakeholders to understand requirements and translate them into technical solutions.
- Writing clean, well-documented, and testable code following best practices.
- Participating in code reviews and contributing to improving the team's development processes.
- Troubleshooting and debugging issues to ensure the stability and reliability of the platform.
- Staying up-to-date with the latest advancements in options pricing models, financial markets, and Java technologies.
- Contributing to the architecture and design of the overall system.
- Mentoring junior engineers and providing technical guidance.

Requirements:
- Bachelor's or Master's degree in Computer Science, Financial Engineering, or a related field.
- 5+ years of experience in software development, with a focus on Java.
- Strong understanding of object-oriented programming principles and design patterns.
- Proven experience in building and optimizing high-performance, multi-threaded applications.
- Solid understanding of financial markets, options theory, and derivative pricing models.
- Experience with numerical methods and algorithms used in options pricing and risk management.
- Proficiency in working with large datasets and data processing techniques.
- Experience with testing frameworks (e.g., JUnit, Mockito) and continuous integration/continuous deployment (CI/CD) pipelines.
- Experience in building distributed systems and APIs.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Join us at Trading Technologies, a Software-as-a-Service (SaaS) technology platform provider to the global capital markets industry. Our award-winning TT platform connects to major international exchanges and liquidity venues, delivering advanced tools for trade execution, order management, market data solutions, analytics, trade surveillance, risk management, and infrastructure services. Be a part of our forward-thinking, culture-based organization with collaborative teams that promote diversity and inclusion.
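To give a concrete flavor of the pricing and Greeks work described above: a European call under the Black-Scholes model can be priced in a few lines. The platform in the posting is Java; Python is used here only for compactness, and the parameter values are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(spot, strike, rate, vol, t):
    """European call price and delta under the Black-Scholes model."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    price = spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)
    delta = norm_cdf(d1)  # sensitivity to spot, one of the Greeks the posting mentions
    return price, delta

# Hypothetical at-the-money option: spot 100, strike 100, 5% rate, 20% vol, 1 year.
price, delta = black_scholes_call(spot=100.0, strike=100.0, rate=0.05, vol=0.2, t=1.0)
```

Production analytics would add the remaining Greeks, numerical methods for path-dependent payoffs, and careful handling of edge cases (t → 0, deep in/out of the money), which is exactly the optimization work the role describes.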
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a GCP Data Engineer, you will be responsible for managing, maintaining, and troubleshooting cloud data pipelines. The ideal candidate should have over 5 years of industry experience in Data Engineering support and enhancement, proficiency in the services of a major cloud platform (GCP, Azure, AWS, etc.), and a strong understanding of data pipeline architectures and ETL processes. Your role will draw on excellent Python programming skills for data processing and automation, SQL query-writing skills for data analysis, and experience with relational databases. Familiarity with version control systems such as Git is also required.

Your responsibilities will include analyzing, troubleshooting, and resolving complex data pipeline issues. You will apply your software engineering experience to optimize data pipelines, improve performance, and enhance reliability. It is essential to continuously optimize data pipeline efficiency, reduce operational costs, automate repetitive tasks in data processing, and set up monitoring and alerting for data pipelines. You will be expected to perform SLA-oriented monitoring for critical pipelines and, where needed, implement improvements (after business approval) to maintain SLA adherence.

Moreover, your role will involve monitoring the performance and reliability of data pipelines, Informatica ETL workflows, MDM, and Control-M jobs. Conducting post-incident reviews, implementing improvements for data pipelines, and developing and maintaining documentation for data pipeline systems and processes are crucial aspects of the job. Experience with data visualization using Google Looker Studio, Tableau, Domo, Power BI, or similar tools is considered an added advantage.

To qualify for this position, you should possess a Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience. Holding a Cloud Professional Data Engineer certification will be an added advantage. Excellent verbal and written communication skills are necessary for effective collaboration and documentation, and strong problem-solving and analytical skills are key to addressing challenges in data engineering.

TELUS Digital is an equal opportunity employer committed to creating a diverse and inclusive workplace that values merit, competence, and performance without regard to any characteristic related to diversity.
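The SLA-oriented pipeline monitoring described above reduces to comparing run durations against an agreed window and alerting on breaches. The run records, pipeline names, and four-hour SLA below are hypothetical illustrations, not details from the posting.

```python
from datetime import datetime, timedelta

# Hypothetical pipeline run log: name, start, end (None = still running).
RUNS = [
    {"pipeline": "daily_sales_load", "start": datetime(2024, 5, 1, 2, 0), "end": datetime(2024, 5, 1, 3, 10)},
    {"pipeline": "mdm_sync",         "start": datetime(2024, 5, 1, 2, 0), "end": datetime(2024, 5, 1, 7, 30)},
]

SLA = timedelta(hours=4)  # assumed SLA window for critical pipelines

def sla_breaches(runs, sla, now=None):
    """Return the names of pipelines whose runs exceeded the SLA window."""
    now = now or datetime(2024, 5, 1, 8, 0)  # fixed "now" for reproducibility
    breaches = []
    for run in runs:
        duration = (run["end"] or now) - run["start"]
        if duration > sla:
            breaches.append(run["pipeline"])
    return breaches

alerts = sla_breaches(RUNS, SLA)
```

In a real deployment the check would read run metadata from the orchestrator (e.g. Airflow or Control-M) and page on-call via an alerting channel rather than returning a list.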
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
About Salesforce: At Salesforce, we are known as the Customer Company, leading the future of business by combining AI, data, and CRM technologies. We are committed to helping companies in various industries innovate and connect with customers in new and meaningful ways. As a Trailblazer at Salesforce, you are encouraged to drive your performance, chart new paths, and contribute to the betterment of our world. If you believe in the power of business as a force for positive change and in the importance of businesses doing well while also doing good, then you have found the right place to thrive.

About the Role: The company is currently looking for a Forward Deployed Engineer - Deployment Strategist to fill a crucial hybrid position that combines technical expertise with strategic problem-solving skills. In this role, you will be responsible for deploying AI-powered solutions on the Salesforce platform with a focus on driving business impact and adoption. As a trusted advisor, you will bridge the gap between customer requirements and product innovation to ensure long-term success and value realization. You will lead a team of Forward Deployed Engineers, oversee deployments, foster collaboration, set goals, and address challenges. Additionally, you will play a key role in connecting customer needs with product development, providing field insights to influence enhancements and accelerate the product roadmap, thereby keeping the Agentforce platform at the forefront of AI solutions. A successful Forward Deployed Engineer - Deployment Strategist will have a deep understanding of our customers' most complex problems and will be adept at crafting and deploying innovative solutions that leverage our Agentforce platform and beyond.
Your Impact:
- Strategic Solution Architecture & Design: Lead the analysis, design, and hands-on implementation of intelligent AI-powered agents within Salesforce environments, utilizing a range of technologies including Agentforce, Data Cloud, Flow, Lightning Web Components (LWC), Apex, and Salesforce APIs. Translate complex business challenges into actionable technical requirements and strategic deployment plans.
- AI & Data Mastery for Impact: Take ownership of the end-to-end data landscape, creating robust data models, developing efficient processing pipelines, and establishing seamless integration strategies. Employ advanced AI orchestration frameworks and engineering techniques to build sophisticated conversational AI solutions that optimize data for AI applications.
- Full-Lifecycle Deployment & Optimization: Oversee the successful deployment of solutions, ensuring seamless integration with existing customer infrastructure. Continuously monitor performance, identify bottlenecks, and implement optimizations to enhance reliability, scalability, and security.
- Entrepreneurial Execution & Rapid Prototyping: Operate with a mindset focused on rapid prototyping, iterative development, and timely delivery of impactful solutions. Adapt quickly to evolving customer priorities and technical challenges in dynamic environments.
- Trusted Technical & Strategic Partner: Collaborate closely with client teams to understand their operational challenges and strategic objectives. Act as a primary technical advisor, providing expert guidance and presenting results that drive measurable value and adoption.
- Product Evolution & Feedback Loop: Act as a crucial feedback loop between customers and internal product/engineering teams to influence future product enhancements. Provide insights that shape the strategic direction of the platform and contribute to broader product improvements.
- Business Process Transformation: Analyze existing business processes and identify automation opportunities through intelligent agents. Guide customers through process transformation and reengineering to drive efficiency and effectiveness.
- Team Leadership in Deployment Execution: Lead a team of peers in executing deployment initiatives, providing technical guidance, promoting collaboration, and ensuring successful project delivery.

Required Qualifications:
- 5+ years of hands-on experience in solutioning, including design, implementation, and testing of cloud-based technologies
- Proficiency in Salesforce platform components like Flow, Lightning Web Components (LWC), and Salesforce APIs
- Hands-on experience with AI/LLM technologies
- Strong background in data modeling, processing, integration, and analytics with expertise in data platforms
- Exceptional problem-solving skills in unstructured environments
- Demonstrated entrepreneurial spirit and focus on customer impact
- Excellent communication and collaboration skills
- Proven team leadership experience
- Prior customer-facing experience in a technical role
- Willingness to travel as needed

Preferred Qualifications:
- Experience with Salesforce Data Cloud and/or Agentforce platform
- Background in developing conversational AI solutions in regulated industries
- Proficiency in programming languages like JavaScript, Java, Python, or Apex
- Salesforce platform certifications
- Knowledge of Salesforce CRM components
- Experience with AI/ML concepts beyond LLMs
- Bonus points for deploying solutions in customer environments
Posted 1 week ago
5.0 - 8.0 years
5 - 10 Lacs
Pune, Maharashtra, India
On-site
Position summary: We are seeking a Senior Software Development Engineer - Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.

Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.

Basic Qualifications
- Bachelor's or Master's Degree in Computer Science or Data Science.
- 5-8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.

Preferred Qualifications
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.
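The data-quality and observability work listed above can be sketched without any of the named platforms. This pure-Python fragment mimics the expectation-style checks that frameworks such as Great Expectations formalize; the rows, column names, and rules are hypothetical.

```python
# Expectation-style data-quality checks, in the spirit of the
# observability frameworks named above. Rows and rules are hypothetical.

rows = [
    {"order_id": 1, "amount": 120.0, "country": "IN"},
    {"order_id": 2, "amount": -5.0,  "country": "IN"},
    {"order_id": 2, "amount": 80.0,  "country": None},
]

def expect_unique(rows, column):
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def expect_not_null(rows, column):
    return all(r[column] is not None for r in rows)

def expect_min(rows, column, minimum):
    return all(r[column] is not None and r[column] >= minimum for r in rows)

report = {
    "order_id_unique": expect_unique(rows, "order_id"),
    "country_not_null": expect_not_null(rows, "country"),
    "amount_non_negative": expect_min(rows, "amount", 0.0),
}
failed = [name for name, ok in report.items() if not ok]
```

At scale the same expectations would run as Spark jobs over partitions rather than Python loops over rows, but the contract (a named check that passes or fails per dataset) is identical.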
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As an experienced Power BI Architect with extensive knowledge of Microsoft Fabric, you will be responsible for leading the design, development, and implementation of innovative Business Intelligence (BI) solutions. Your expertise in enterprise data architecture, analytics platforms, and data integration strategies will be crucial in optimizing data pipelines and driving performance and scalability through the effective use of Power BI and Microsoft Fabric.

Your key responsibilities will include developing comprehensive Power BI solutions such as dashboards, reports, and data models to meet business needs. You will lead the entire lifecycle of BI projects, from requirement gathering to deployment, ensuring optimal performance. Utilizing Microsoft Fabric, you will streamline data pipelines by integrating data engineering, data storage, and data processing capabilities. Integrating Power BI with Microsoft Fabric will be essential for improved performance, scalability, and efficiency. Your role will also involve working with Azure Data Services (e.g., Azure Data Lake, Azure Synapse, Azure Data Factory) to support the BI architecture. Establishing and implementing best practices in Power BI development, including DAX functions, data transformations, and data modeling, will be part of your responsibilities.

Additionally, you will lead and mentor a team of Power BI developers, ensuring high-quality output and adherence to best practices. You will oversee task prioritization, resource allocation, and project timelines to ensure timely and successful delivery of BI solutions. Collaboration with data engineers and stakeholders to translate business requirements into functional, scalable BI solutions will be crucial, as will driving BI initiatives to ensure alignment with business goals and objectives.

To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field, along with 10-15 years of experience in BI development, including at least 3 years in a leadership role. Proven experience with Power BI, Microsoft Fabric, and Azure Data Services will be essential for success in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
Genpact is a global professional services and solutions firm with a workforce of over 125,000 individuals in more than 30 countries. We are characterized by our innate curiosity, entrepreneurial agility, and commitment to creating enduring value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises globally, including the Fortune Global 500. We leverage our profound business and industry knowledge, digital operations services, and expertise in data, technology, and AI to deliver impactful outcomes.

We are currently seeking applications for the position of Senior Principal Consultant - QA Engineer!

Responsibilities:
- Develop comprehensive test plans, test cases, and test scenarios based on functional and non-functional requirements.
- Manage the test case life cycle efficiently.
- Execute and analyze manual and automated tests to identify defects and ensure the quality of software applications.
- Collaborate closely with development teams to align test cases with development goals and timelines.
- Work with cross-functional teams to ensure adequate testing coverage and effective communication of test results.

Moreover, the ideal candidate should possess the ability to manage repeatable standard processes while also demonstrating proficiency in identifying and resolving ad-hoc issues.

Qualifications we seek in you! Minimum Qualifications:
- Proficiency in SQL, ETL testing, and writing test scripts in Python to validate functionality, create automation frameworks, and ensure the performance and reliability of data systems.
- In-depth understanding of the data domain, encompassing data processing, storage, and retrieval.
- Strong collaboration, communication, and analytical skills.
- Experience in reviewing system requirements and tracking quality assurance metrics such as defect densities and open defect counts.
- Experience in creating and enhancing the integration of CI/CD pipelines.
- Familiarity with Agile/Scrum development processes.
- Some exposure to performance and security testing.
- Hands-on experience in test execution using AWS services, particularly MSK, EKS, Redshift, and S3.

If you are passionate about quality assurance engineering and possess the required qualifications, we invite you to apply for this exciting opportunity!

Job Details:
- Job Title: Senior Principal Consultant
- Location: India-Gurugram
- Schedule: Full-time
- Education Level: Bachelor's / Graduation / Equivalent
- Job Posting Date: Sep 18, 2024, 4:28:53 AM
- Unposting Date: Oct 18, 2024, 1:29:00 PM
- Master Skills List: Digital
- Job Category: Full Time
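The SQL and ETL-testing skills above can be sketched end to end with the standard-library sqlite3 module: load a staging and a target table, then assert that the load reconciles. The table names, columns, and checks are hypothetical.

```python
import sqlite3

# Hypothetical source-to-target reconciliation test for an ETL load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders      (id INTEGER, amount REAL);
    INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dw_orders      VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def scalar(sql):
    """Run a query and return its single scalar result."""
    return conn.execute(sql).fetchone()[0]

# Row counts must match after the load.
count_match = scalar("SELECT COUNT(*) FROM staging_orders") == scalar("SELECT COUNT(*) FROM dw_orders")
# Aggregates must match too -- a cheap completeness check beyond counts.
sum_match = scalar("SELECT SUM(amount) FROM staging_orders") == scalar("SELECT SUM(amount) FROM dw_orders")
```

In a real suite these checks would be pytest cases pointed at Redshift or another warehouse, run automatically inside the CI/CD pipeline the posting mentions.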
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Haryana
On-site
As a Data Engineer (Manager) specializing in web scraping with 5 to 10 years of experience, you will play a key role in designing, implementing, and maintaining automated systems to extract, process, and analyze data from various online sources. Your work will be crucial in gathering valuable insights to support business decisions and strategies.

Your responsibilities will include leading and managing a team of data engineers focused on web scraping and data extraction; designing and maintaining scalable web scraping pipelines and ETL processes; collaborating with cross-functional teams to understand data requirements and deliver effective solutions; ensuring data quality, integrity, and security; optimizing web scraping workflows for performance and efficiency; evaluating and integrating new tools and technologies; developing and enforcing best practices for web scraping; and providing mentorship and professional development opportunities for team members.

To excel in this role, you should have proficiency in web scraping tools and frameworks such as Scrapy, Beautiful Soup, and Selenium; strong programming skills in languages like Python, Java, or similar; experience with data storage solutions including SQL, NoSQL, and cloud databases; knowledge of APIs and data integration techniques; familiarity with big data technologies like Hadoop and Spark; leadership and team management skills; and excellent problem-solving and analytical abilities.

Preferred qualifications include a Bachelor's or Master's degree in Computer Science, Data Engineering, or related fields; experience in handling large-scale data extraction projects; and knowledge of data governance and compliance regulations.

If you are passionate about leveraging web scraping to drive data-driven decision-making, this role offers an exciting opportunity to lead a team of data engineers, design cutting-edge solutions, and contribute to the success of the business.
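To give a minimal flavor of the extraction work described above, here is a link extractor built only on the standard library's html.parser (production pipelines would use Scrapy or Beautiful Soup, as the posting notes). The HTML snippet is a hypothetical example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical fragment of a listings page.
html = '<ul><li><a href="/jobs/1">NLP Engineer</a></li><li><a href="/jobs/2">Data Engineer</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
```

A real scraping pipeline adds the pieces the posting emphasizes: polite fetching (rate limits, robots.txt), retries, deduplication, and loading the extracted records into downstream storage.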
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
The Finance & Accounting Management position at Aloft Bengaluru Outer Ring Road involves supporting the day-to-day execution of general ledger processes. You will assist clients in understanding these processes, performing accounting functions such as account balancing, ledger reconciliation, reporting, and resolving discrepancies. Your responsibilities will include coordinating and implementing accounting projects, conducting Accounting SOP audits, complying with fraud and collection laws, generating accurate reports, and analyzing information to solve problems effectively.

To qualify for this role, you should hold a 4-year bachelor's degree in Finance and Accounting or a related major. If you have a 2-year degree in the same field, you must have at least 1 year of experience in finance and accounting or a related professional area. You will be expected to manage work, projects, and policies by coordinating accounting tasks, submitting reports on time, documenting profits and losses accurately, and ensuring compliance with tax regulations. Additionally, you will demonstrate and apply accounting knowledge by staying updated on relevant issues, systems, and processes, using computer systems proficiently, and making informed decisions based on laws and regulations.

In this role, you will also be responsible for providing information to supervisors and co-workers, demonstrating personal integrity, utilizing effective listening skills, managing time efficiently, and presenting ideas clearly and concisely. Marriott International maintains a diverse and inclusive workforce and does not discriminate on the basis of protected characteristics such as disability and veteran status. Aloft Hotels values connecting with guests and providing them with a unique experience in a modern and vibrant environment. If you are a confident individual who enjoys building connections with others, Aloft Hotels offers a dynamic work environment within the Marriott International brand where you can grow both personally and professionally.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The Laser Scanning Analysis and Support Engineer will be responsible for assisting laser scanning projects related to the repair of turbomachinery, analyzing point cloud data, and providing technical support to clients and internal teams. This role requires a strong understanding of laser scanning technology, data processing, and 3D modeling, with a specific focus on the demanding requirements of turbomachinery applications.

Key responsibilities:
- Process and analyze point cloud data using specialized software to build accurate 3D models of turbo compressors and associated components.
- Generate detailed reports and visualizations to support design, repair, and optimization efforts.
- Troubleshoot and resolve issues related to laser scanning equipment and software.
- Provide technical assistance and support to service centers regarding laser scanning projects, data interpretation, and the application of findings to turbomachinery repair.
- Work closely with project managers, engineers, and other collaborators to understand project requirements and deliverables specific to turbomachinery.
- Prepare and present findings, recommendations, and technical reports to clients and internal teams, emphasizing the implications for turbo compressor performance and efficiency.
- Ensure the accuracy and quality of data collected and processed, adhering to industry standards and best practices in laser scanning and turbomachinery analysis.

The ideal candidate should have a Bachelor's degree in Mechanical Engineering, Aerospace Engineering, or a related field with a focus on turbomachinery, and at least 3 years of proven experience in laser scanning, surveying, or 3D modeling, preferably in the context of turbomachinery or industrial applications. Proficiency in point cloud software (e.g., PolyWorks, Spatial Analyzer, or similar) and CAD software (e.g., Siemens NX, AutoCAD) is required, along with strong analytical skills and attention to detail, particularly in interpreting complex geometries and engineering specifications. Excellent communication and interpersonal skills, with the ability to convey technical information to diverse audiences, and the ability to work both independently and collaboratively within a multidisciplinary team round out the profile.

Siemens Energy's Transformation of Industry division is decarbonizing the industrial sector by enabling the transition to sustainable processes, building on a strong industrial customer base, a global network, diverse technologies, and coordinated execution capabilities. The successful candidate will play a crucial role in driving Siemens Energy's mission forward by contributing to the division's efforts in growing electrification and efficiency in the industrial sector.

Siemens Energy is a global energy technology company with a commitment to developing sustainable and reliable energy systems for the future. With a distributed team of dedicated employees around the world, we are at the forefront of the energy transition, driving innovation and providing solutions that meet the growing energy demand of the global community. Join us in our mission to make sustainable, reliable, and affordable energy a reality through decarbonization, new technologies, and energy transformation.

Siemens Energy values diversity and inclusion as key drivers of creativity and innovation. With employees from over 130 nationalities, we celebrate the unique contributions of individuals from diverse backgrounds, regardless of ethnicity, gender, age, religion, identity, or disability. Our inclusive culture empowers us to energize society and drive positive change through collaboration and respect for differences.

Employee benefits at Siemens Energy include remote working arrangements, medical insurance coverage for employees and their families, and other perks such as meal cards as part of the compensation package. We are committed to providing a supportive and inclusive work environment that values the well-being and professional growth of our employees.

For more information on how you can contribute to Siemens Energy's mission and be part of our diverse and innovative team, visit https://www.siemens-energy.com/employeevideo. Join us in shaping the future of energy and making a positive impact on society through sustainable and transformative solutions.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Telangana
On-site
Support the property's technology Information Resources objectives, including planning, decision-making, implementation, and maintenance, while interacting with vendors, owners, and property staff. The ideal candidate will possess either a 2-year degree from an accredited university in Information Technology, Computer Science, or a related major with 4 years of experience in Information Technology, Computer Science, or a related professional area, OR a 4-year bachelor's degree from an accredited university in Information Technology, Computer Science, or a related major, along with being a certified trainer and having 2 years of experience in Information Technology, Computer Science, or a related professional area.

Key responsibilities include supporting client technology needs by utilizing computers and systems for functions, data entry, and information processing; monitoring and managing property-based systems; analyzing information for problem identification and proposing solutions; maintaining and repairing equipment; overseeing computer and network operations; and ensuring smooth administration functions.

In addition, the role involves managing projects and policies to ensure compliance with laws, regulations, and standards; enforcing IR policies and standards to safeguard company hardware, software, and resources; and maintaining information systems and technology goals by developing specific plans, setting priorities, and allocating resources efficiently. The successful candidate will also demonstrate and apply IR knowledge by staying updated on technical advancements, showcasing expertise in job-relevant issues, products, systems, and processes, and providing technical support when needed. Other responsibilities include communicating with supervisors and co-workers via various channels, analyzing information to solve problems effectively, coordinating property efforts, managing vendors for IT requirements, and serving as an escalation point for problem resolution.

Marriott International is an equal opportunity employer that values diversity and promotes an inclusive, people-first culture. Joining the Sheraton family means becoming part of a global community that has been connecting people since 1937. Sheraton associates strive to create a sense of belonging in over 400 communities worldwide by delivering engaging experiences and thoughtful service. If you are a team player eager to provide a meaningful guest experience, consider exploring career opportunities with Sheraton and being part of The World's Gathering Place mission. Joining Sheraton Hotels & Resorts offers you a chance to do your best work, be part of an amazing global team, and become the best version of yourself.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Jaipur, Rajasthan
On-site
You should have experience in SQL, database design, stored procedures, performance tuning, and related areas, along with extensive knowledge of data processing and relational database systems (Oracle, ideally Oracle 19c). Strong analytical skills are required to analyze existing databases and understand the client's needs in order to develop effective systems, and good communication skills, both oral and written, are essential. You must possess sound accuracy and logic to develop databases from scratch in a correct, logical manner, and an understanding of the concepts of other programming languages, including front-end languages, is also necessary.

The role further requires troubleshooting issues along the CI/CD pipeline and hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools, as well as working experience with Git/GitLab/Bitbucket. You will develop automation scripts for repetitive tasks such as rebalancing partitions and auto-deployment of code and dependencies, write scripts to automate tests and validate data pipelines, and automate and streamline software development and infrastructure management processes. Application performance monitoring and logging are part of the duties, as is understanding the organization's needs and applying solutions, tools, and standard methodologies with both an immediate and a long-term perspective.
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
delhi
On-site
As a part of our team at Darshana Singal Foundation, you will be responsible for coordinating with various center heads to effectively manage operations at our 2 skill development academies, which are part of our NGO initiative. Additionally, you will play a key role in data processing to support our programs and initiatives. Darshana Singal Foundation serves as the CSR wing of Eastman Auto and Power Limited, dedicated to three core projects: education, livelihood, and infrastructure development. Join us in our mission to create a positive impact in the community and contribute towards meaningful social change.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
indore, madhya pradesh
On-site
Golden Eagle IT Technologies Pvt. Ltd. is looking for a skilled Data Engineer with 2 to 4 years of experience to join the team in Indore. The ideal candidate should have a solid background in data engineering, big data technologies, and cloud platforms. As a Data Engineer, you will be responsible for designing, building, and maintaining efficient, scalable, and reliable data pipelines. You will be expected to develop and maintain ETL pipelines using tools like Apache Airflow, Spark, and Hadoop. Additionally, you will design and implement data solutions on AWS, leveraging services such as DynamoDB, Athena, Glue Data Catalog, and SageMaker. Working with messaging systems like Kafka for managing data streaming and real-time data processing will also be part of your responsibilities. Proficiency in Python and Scala for data processing, transformation, and automation is essential. Ensuring data quality and integrity across multiple sources and formats will be a key aspect of your role. Collaboration with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions is crucial. Optimizing and tuning data systems for performance and scalability, as well as implementing best practices for data security and compliance, are also expected. Preferred skills include experience with infrastructure as code tools like Pulumi, familiarity with GraphQL for API development, and exposure to machine learning and data science workflows, particularly using SageMaker. Qualifications for this position include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 2-4 years of experience in data engineering or a similar role. Proficiency in AWS cloud services and big data technologies, strong programming skills in Python and Scala, knowledge of data warehousing concepts and tools, as well as excellent problem-solving and communication skills are required.
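The extract-transform-load shape this role describes can be sketched at toy scale in plain Python. In the role itself this would run on Spark or Airflow; the stage names and the "user_id"/"event" record fields here are illustrative assumptions only.

```python
# Hedged pure-Python sketch of an ETL pipeline with a simple data-quality
# gate in the transform step; all field names are assumptions.

def extract(source):
    # In production this might read from Kafka or S3; here it's an in-memory list.
    return list(source)

def transform(records):
    # Normalize field types/casing and drop malformed rows (a simple quality gate).
    clean = []
    for rec in records:
        if "user_id" not in rec or rec.get("event") is None:
            continue
        clean.append({"user_id": int(rec["user_id"]), "event": rec["event"].lower()})
    return clean

def load(records, sink):
    # Stand-in for writing to a warehouse table; returns rows loaded.
    sink.extend(records)
    return len(records)

if __name__ == "__main__":
    sink = []
    raw = [
        {"user_id": "7", "event": "CLICK"},
        {"event": "view"},                      # missing key: dropped
        {"user_id": "8", "event": None},        # null event: dropped
    ]
    print(load(transform(extract(raw)), sink), sink)
```

The same three-stage decomposition is what an Airflow DAG would express as tasks, with each function swapped for a Spark job at scale.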
Posted 1 week ago
4.0 - 8.0 years
8 - 12 Lacs
Mumbai
Work from Office
Lead a high-performing technical team in delivering survey data analytics deliverables and intelligent automation solutions. Initially focused on client engagement and operational excellence, this role evolves into spearheading backend development and process innovation across the market research lifecycle. Job Description: Key Responsibilities Client & Stakeholder Engagement (Short-Term Focus) Act as the primary point of contact for key clients, translating research goals into technical deliverables. Ensure timely, accurate, and high-quality outputs aligned with client expectations and market research standards. Partner with research and project managers to ensure stakeholder feedback is embedded in deliverables. Team Leadership & Capability Development (Short-Term Focus) Guide and mentor a multidisciplinary team (SQL, R, Python, Tableau) in delivering data processing and reporting solutions. Lead sprint planning, resource allocation, and task optimization across concurrent client deliverables. Elevate team performance through structured review processes and personalized skill development plans. Technical Strategy & Innovation (Growing Long-Term Focus) Architect automation and data products to accelerate turnaround time and boost data integrity. Conceptualize and build modular backend components using Python, APIs, microservices, and containerized frameworks. Drive innovation by integrating LLM-based tools and AI models into existing workflows to enhance analytics and decision support. Collaborate cross-functionally to prototype, iterate, and refine full-stack solutions. Deliver internal training and documentation to democratize automation across the team. Required Skills & Qualifications Must-Have: 7-10 years of experience with the Market Research tech stack. Strong leadership with a track record of delivering end-to-end client projects. Deep understanding of Forsta Surveys, Qualtrics, SPSS, and data platforms. Advanced programming in SQL, Python, R, React, and VBA.
Familiarity with Agile methodology and project management tools. Good to Have: Experience integrating APIs and developing full-stack applications. Exposure to LLM-based apps (e.g., GPT integration). Understanding of DevOps practices, Git-based version control, and microservice architecture. Awareness of PMP or Scrum frameworks. Location: Coimbatore Brand: Merkle Time Type: Full time Contract Type: Permanent
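The survey-tabulation automation this role covers often reduces to weighted cross-tabs. Below is a hedged sketch of one such step; the "answer"/"weight" field names and the weighting scheme are assumptions, not any client's actual spec.

```python
# Illustrative survey-tabulation helper: weighted percentage distribution
# of answers to a single question; field names are assumptions.
from collections import defaultdict

def weighted_distribution(responses, answer_key="answer", weight_key="weight"):
    """Weighted percentage distribution of answers to one survey question."""
    totals = defaultdict(float)
    for resp in responses:
        totals[resp[answer_key]] += resp.get(weight_key, 1.0)
    grand = sum(totals.values())
    return {ans: round(100 * w / grand, 1) for ans, w in totals.items()}

if __name__ == "__main__":
    data = [
        {"answer": "yes", "weight": 2.0},
        {"answer": "no", "weight": 1.0},
        {"answer": "yes", "weight": 1.0},
    ]
    print(weighted_distribution(data))
```

A backend team would typically wrap functions like this behind an API so the same tabulation logic serves both automated deliverables and ad-hoc researcher queries.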
Posted 1 week ago
2.0 - 5.0 years
6 - 9 Lacs
Mumbai
Work from Office
About this role What are Aladdin and Aladdin Engineering? You will be working on BlackRock's investment operating system called Aladdin, which is used both internally within BlackRock and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform. It powers informed decision-making and creates a connective tissue for thousands of users investing worldwide. Our development teams are part of Aladdin Engineering. We collaborate to build the next generation of technology that transforms the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users worldwide every day. Your Team: The Database Hosting Team is a key part of Platform Hosting Services, which operates under the broader Aladdin Engineering group. Hosting Services is responsible for managing the reliability, stability, and performance of the firm's financial systems, including Aladdin, and ensuring its availability to our business partners and customers. We are a globally distributed team, spanning multiple regions, providing engineering and operational support for online transaction processing, data warehousing, data replication, and distributed data processing platforms. Your Role and Impact: Data is the backbone of any world-class financial institution. The Database Operations Team ensures the resiliency and integrity of that data while providing instantaneous access to a large global user base at BlackRock and across many institutional clients. As specialists in database technology, our team is involved in every aspect of system design, implementation, tuning, and monitoring, using a wide variety of industry-leading database technologies.
We also develop code to provide analysis, insights, and automate our solutions at scale. Although our specialty is database technology, to excel in our role, we must understand the environment in which our technology operates. This includes understanding the business needs, application server stack, and interactions between database software, operating systems, and host hardware to deliver the best possible service. We are passionate about performance and innovation. At every level of the firm, we embrace diversity and offer flexibility to enhance work-life balance. Your Responsibilities: The role involves providing operations, development, and project support within the global database environment across various platforms. Key responsibilities include: Operational Support for Database Technology: Engineering, administration, and operations of OLTP, OLAP, data warehousing platforms, and distributed No-SQL systems. Collaboration with infrastructure teams, application developers, and business teams across time zones to deliver high-quality service to Aladdin users. Automation and development of database operational, monitoring, and maintenance toolsets to achieve scalability and efficiency. Database configuration management, capacity and scale management, schema releases, consistency, security, disaster recovery, and audit management. Managing operational incidents, conducting root-cause analysis, resolving critical issues, and mitigating future risks. Assessing issues for severity, troubleshooting proactively, and ensuring timely resolution of critical system issues. Escalating outages when necessary, collaborating with Client Technical Services and other teams, and coordinating with external vendors for support. Project-Based Participation: Involvement in major upgrades and migration/consolidation exercises. Exploring and implementing new product features. Contributing to performance tuning and engineering activities. 
Contributing to Our Software Toolset: Enhancing monitoring and maintenance utilities in Perl, Python, and Java. Contributing to data captures to enable deeper system analysis. Qualifications: B.E./B.Tech/MCA or another relevant engineering degree from a reputable university. 4+ years of proven experience in Data Administration or a similar role. Skills and Experience: Enthusiasm for acquiring new technical skills. Effective communication with senior management from both IT and business areas. Understanding of large-scale enterprise application setups across data centers/cloud environments. Willingness to work weekends on DBA activities and shift hours. Experience with database platforms like SAP Sybase, Microsoft SQL Server, Apache Cassandra, Cosmos DB, PostgreSQL, and data warehouse platforms such as Snowflake, Greenplum. Exposure to public cloud platforms such as Microsoft Azure, AWS, and Google Cloud. Knowledge of programming languages like Python, Perl, Java, Go; automation tools such as Ansible/AWX; source control systems like GIT and Azure DevOps. Experience with operating systems like Linux and Windows. Strong background in supporting mission-critical applications and performing deep technical analysis. Flexibility to work with various technologies and write high-quality code. Exposure to project management. Passion for interactive troubleshooting, operational support, and innovation. Creativity and a drive to learn new technologies. Data-driven problem-solving skills and a desire to scale technology for future needs. Operating Systems: Familiarity with Linux/Windows. Proficiency with shell commands (grep, find, sed, awk, ls, cp, netstat, etc.). Experience checking system performance metrics like CPU, memory, and disk usage on Unix/Linux. Other Personal Characteristics: Integrity and the highest ethical standards. Ability to quickly adjust to complex data and information, displaying strong learning agility.
Self-starter with a commitment to superior performance. Natural curiosity and a desire to always learn. If this excites you, we would love to discuss your potential role on our team! Our benefits. Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer.
We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 1 week ago
2.0 - 5.0 years
6 - 9 Lacs
Gurugram
Work from Office
About this role What are Aladdin and Aladdin Engineering? You will be working on BlackRock's investment operating system called Aladdin, which is used both internally within BlackRock and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform. It powers informed decision-making and creates a connective tissue for thousands of users investing worldwide. Our development teams are part of Aladdin Engineering. We collaborate to build the next generation of technology that transforms the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users worldwide every day. Your Team: The Database Hosting Team is a key part of Platform Hosting Services, which operates under the broader Aladdin Engineering group. Hosting Services is responsible for managing the reliability, stability, and performance of the firm's financial systems, including Aladdin, and ensuring its availability to our business partners and customers. We are a globally distributed team, spanning multiple regions, providing engineering and operational support for online transaction processing, data warehousing, data replication, and distributed data processing platforms. Your Role and Impact: Data is the backbone of any world-class financial institution. The Database Operations Team ensures the resiliency and integrity of that data while providing instantaneous access to a large global user base at BlackRock and across many institutional clients. As specialists in database technology, our team is involved in every aspect of system design, implementation, tuning, and monitoring, using a wide variety of industry-leading database technologies.
We also develop code to provide analysis, insights, and automate our solutions at scale. Although our specialty is database technology, to excel in our role, we must understand the environment in which our technology operates. This includes understanding the business needs, application server stack, and interactions between database software, operating systems, and host hardware to deliver the best possible service. We are passionate about performance and innovation. At every level of the firm, we embrace diversity and offer flexibility to enhance work-life balance. Your Responsibilities: The role involves providing operations, development, and project support within the global database environment across various platforms. Key responsibilities include: Operational Support for Database Technology: Engineering, administration, and operations of OLTP, OLAP, data warehousing platforms, and distributed No-SQL systems. Collaboration with infrastructure teams, application developers, and business teams across time zones to deliver high-quality service to Aladdin users. Automation and development of database operational, monitoring, and maintenance toolsets to achieve scalability and efficiency. Database configuration management, capacity and scale management, schema releases, consistency, security, disaster recovery, and audit management. Managing operational incidents, conducting root-cause analysis, resolving critical issues, and mitigating future risks. Assessing issues for severity, troubleshooting proactively, and ensuring timely resolution of critical system issues. Escalating outages when necessary, collaborating with Client Technical Services and other teams, and coordinating with external vendors for support. Project-Based Participation: Involvement in major upgrades and migration/consolidation exercises. Exploring and implementing new product features. Contributing to performance tuning and engineering activities. 
Contributing to Our Software Toolset: Enhancing monitoring and maintenance utilities in Perl, Python, and Java. Contributing to data captures to enable deeper system analysis. Qualifications: B.E./B.Tech/MCA or another relevant engineering degree from a reputable university. 4+ years of proven experience in Data Administration or a similar role. Skills and Experience: Enthusiasm for acquiring new technical skills. Effective communication with senior management from both IT and business areas. Understanding of large-scale enterprise application setups across data centers/cloud environments. Willingness to work weekends on DBA activities and shift hours. Experience with database platforms like SAP Sybase, Microsoft SQL Server, Apache Cassandra, Cosmos DB, PostgreSQL, and data warehouse platforms such as Snowflake, Greenplum. Exposure to public cloud platforms such as Microsoft Azure, AWS, and Google Cloud. Knowledge of programming languages like Python, Perl, Java, Go; automation tools such as Ansible/AWX; source control systems like GIT and Azure DevOps. Experience with operating systems like Linux and Windows. Strong background in supporting mission-critical applications and performing deep technical analysis. Flexibility to work with various technologies and write high-quality code. Exposure to project management. Passion for interactive troubleshooting, operational support, and innovation. Creativity and a drive to learn new technologies. Data-driven problem-solving skills and a desire to scale technology for future needs. Operating Systems: Familiarity with Linux/Windows. Proficiency with shell commands (grep, find, sed, awk, ls, cp, netstat, etc.). Experience checking system performance metrics like CPU, memory, and disk usage on Unix/Linux. Other Personal Characteristics: Integrity and the highest ethical standards. Ability to quickly adjust to complex data and information, displaying strong learning agility.
Self-starter with a commitment to superior performance. Natural curiosity and a desire to always learn. If this excites you, we would love to discuss your potential role on our team! Our benefits. Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer.
We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 1 week ago
7.0 - 12.0 years
16 - 17 Lacs
Chennai
Work from Office
Responsible for planning and designing new software and web applications. Analyzes, tests and assists with the integration of new applications. Oversees the documentation of all development activity. Trains non-technical personnel. Assists with tracking performance metrics. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and shares expertise. Job Description Core Responsibilities Required Skills & Experience: 8 to 12 years. Proficiency in Python or any mainstream backend programming language. Strong experience with databases and caching technologies (e.g., Redis). Hands-on experience with ETL pipelines and batch data processing. Expertise in CI/CD tools and deployment automation (e.g., Jenkins, Concourse). Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK, Kibana). Experience with container orchestration and related tools (e.g., Kubernetes, Helm, Vector, Vault). Proven track record of building reliable, observable, and high-performance backend services. Experience optimizing and scaling services to handle global, high-traffic workloads. Experience with machine learning operations (MLOps), including deployment and monitoring of ML models in production. Experience managing A/B tests for ML models and collaborating with ML researchers. Collaborates with project stakeholders to identify product and technical requirements. Conducts analysis to determine integration needs. Designs new software and web applications, supports applications under development and customizes current applications. Develops software update process for existing applications. Assists in the roll-out of software releases. Trains junior Software Development Engineers on internally developed software applications.
Oversees the researching, writing and editing of documentation and technical requirements, including evaluation plans, test results, technical manuals and formal recommendations and reports. Keeps current with technological developments within the industry. Monitors and evaluates competitive applications and products. Reviews literature, patents and current practices relevant to the solution of assigned projects. Provides technical leadership throughout the design process and guidance with regards to practices, procedures and techniques. Serves as a guide and mentor for junior level Software Development Engineers. Assists in tracking and evaluating performance metrics. Ensures team delivers software on time, to specification and within budget. Works with Quality Assurance team to determine if applications fit specification and technical requirements. Displays expertise in knowledge of engineering methodologies, concepts and skills and their application in the area of specified engineering specialty. Displays expertise in process design and redesign skills. Presents and defends architectural, design and technical choices to internal audiences. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. 
Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the benefits summary on our careers site for more details. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Certifications (if applicable) Relevant Work Experience 7-10 Years Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.
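The caching requirement this posting names (e.g., Redis) usually takes the cache-aside shape: check the cache, fall through to the source of truth on a miss, then populate. The sketch below is a hedged, self-contained illustration in which a plain dict stands in for a Redis client; that substitution, and all names here, are assumptions for illustration only.

```python
# Minimal cache-aside sketch; a dict stands in for Redis so the example
# stays self-contained and runnable.

class CacheAside:
    def __init__(self, loader):
        self._cache = {}        # stand-in for a Redis client
        self._loader = loader   # slow source of truth (e.g., a database query)
        self.misses = 0         # the kind of counter Prometheus would scrape

    def get(self, key):
        if key not in self._cache:
            self.misses += 1
            self._cache[key] = self._loader(key)  # populate on miss
        return self._cache[key]

if __name__ == "__main__":
    cache = CacheAside(loader=lambda k: k.upper())
    print(cache.get("a"), cache.get("a"), cache.misses)
```

With a real Redis client the dict operations would become GET/SET calls with a TTL, and the miss counter would feed the monitoring stack the posting lists.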
Posted 1 week ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Career Category Information Systems Job Description Join Amgen's Mission of Serving Patients At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Sr Data Engineer What you will do Let's do this. Let's change the world. In this vital role you will be responsible for "Run" and "Build" project portfolio execution, collaborate with business partners and other IS service leads to deliver IS capability and roadmap in support of business strategy and goals. Real world data analytics, visualization and advanced technology play a vital role in supporting Amgen's industry-leading innovative Real World Evidence approaches. The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment, manage scope, timelines, and risks. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. Collaborate and communicate effectively with product teams. What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree / Bachelor's degree and 8 to 13 years of experience in Computer Science, IT or related field. Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), workflow orchestration, performance tuning on big data processing. Hands-on experience with various Python/R packages for EDA, feature engineering and machine learning model training. Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Strong understanding of data governance frameworks, tools, and standard processes. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA). Preferred Qualifications: Good-to-Have Skills: Experience with ETL tools such as Apache Spark, and various Python packages related to data processing, machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, OMOP. Professional Certifications: Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments). Certified Data Scientist (preferred on Databricks or cloud environments). Machine Learning Certification (preferred on Databricks or cloud environments). SAFe for Teams certification (preferred). Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Career Category: Engineering

Job Description

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Science Engineer

What you will do

Let's do this. Let's change the world. In this vital role, we are seeking a highly skilled Machine Learning Engineer with a strong MLOps background to join our team. You will play a pivotal role in building and scaling our machine learning models from development to production. Your expertise in both machine learning and operations will be essential in creating efficient and reliable ML pipelines.

Roles & Responsibilities:
- Collaborate with data scientists to develop, train, and evaluate machine learning models
- Build and maintain MLOps pipelines, including data ingestion, feature engineering, model training, deployment, and monitoring
- Leverage cloud platforms (AWS, GCP, Azure) for ML model development, training, and deployment
- Implement DevOps/MLOps best practices to automate ML workflows and improve efficiency
- Develop and implement monitoring systems to track model performance and identify issues
- Conduct A/B testing and experimentation to optimize model performance
- Work closely with data scientists, engineers, and product teams to deliver ML solutions
- Stay updated with the latest trends and advancements

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's or Bachelor's degree and 5 to 9 years of experience [Job Code's Discipline and/or Sub-Discipline]

Functional Skills:

Must-Have Skills:
- Solid foundation in machine learning algorithms and techniques
- Experience with MLOps practices and tools (e.g., MLflow, Kubeflow, Airflow) and DevOps tools (e.g., Docker, Kubernetes, CI/CD)
- Proficiency in Python and relevant ML libraries (e.g., TensorFlow, PyTorch, scikit-learn)
- Outstanding analytical and problem-solving skills; ability to learn quickly; good communication and interpersonal skills

Good-to-Have Skills:
- Experience with big data technologies (e.g., Spark, Hadoop) and performance tuning in query and data processing
- Experience with data engineering and pipeline development
- Experience with statistical techniques and hypothesis testing, including regression analysis, clustering, and classification
- Knowledge of NLP techniques for text analysis and sentiment analysis
- Experience analyzing time-series data for forecasting and trend analysis

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
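The model-monitoring responsibility in the MLOps role above can be sketched in a few lines. This is an illustrative toy with a rolling accuracy window; the class name, window size, and threshold are invented, and this is not any specific monitoring tool's API:

```python
from collections import deque

class PerformanceMonitor:
    """Toy model-monitoring sketch: track rolling accuracy over the last
    `window` predictions and flag degradation below `threshold`."""

    def __init__(self, window=100, threshold=0.8):
        self.results = deque(maxlen=window)  # old results fall off automatically
        self.threshold = threshold

    def record(self, prediction, actual):
        self.results.append(prediction == actual)

    def rolling_accuracy(self):
        return sum(self.results) / len(self.results) if self.results else None

    def degraded(self):
        acc = self.rolling_accuracy()
        return acc is not None and acc < self.threshold

monitor = PerformanceMonitor(window=4, threshold=0.8)
for pred, actual in [(1, 1), (0, 0), (1, 0), (0, 1)]:
    monitor.record(pred, actual)
print(monitor.rolling_accuracy(), monitor.degraded())  # 0.5 True
```

A production system would attach an alerting channel and track latency and data-drift metrics alongside accuracy, but the windowed-metric-plus-threshold pattern is the core idea.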
Posted 1 week ago
5.0 - 9.0 years
16 - 18 Lacs
Hyderabad
Work from Office
Career Category: Information Systems

Job Description

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer

What you will do

Let's do this. Let's change the world. In this vital role you will be responsible for "Run" and "Build" project portfolio execution, and will collaborate with business partners and other IS service leads to deliver IS capability and a roadmap in support of business strategy and goals. Real-world data analytics, visualization, and advanced technology play a vital role in supporting Amgen's industry-leading, innovative Real World Evidence approaches. The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimates on technical implementation
- Collaborate and communicate effectively with product teams

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's or Bachelor's degree and 5 to 9 years of experience in Computer Science, IT, or a related field

Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, Spark SQL), including workflow orchestration and performance tuning of big data processing
- Hands-on experience with various Python/R packages for EDA, feature engineering, and machine learning model training
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and best practices
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Preferred Qualifications:

Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, OMOP

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments)
- Certified Data Scientist (preferred on Databricks or cloud environments)
- Machine Learning Certification (preferred on Databricks or cloud environments)
- SAFe for Teams certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
3.0 - 7.0 years
10 - 11 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Define and contribute at a high level to many aspects of our collaborative Agile development process: Big Data development and automated testing of new and existing components in an Agile, DevOps, and dynamic environment
- Work with data delivery teams to set up new Hadoop users; this includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Spark, and MapReduce access for the new user
- Execute the review/acceptance/tuning process and ensure environment availability 24x7
- Deliver general operational excellence, including good troubleshooting and an understanding of the system's capacity and bottlenecks

Must-Haves:
- Strong problem-solving skills and adaptability to a complex environment
- Providing technical support for and designing Hadoop Big Data platforms (preferably Cloudera distributions: Hive, Beeline, Spark, HDFS, Kafka, YARN, ZooKeeper, etc.), handling and identifying possible failure scenarios (Incident Management), responding to end users of the Hadoop platform on data or application issues, and reporting and monitoring daily SLAs to identify vulnerabilities and opportunities for improvement
- Hands-on experience with large-scale Big Data environment builds, capacity planning, performance tuning, and monitoring, including end-to-end Cloudera cluster installation
- Handling Hadoop security activities using Apache Ranger, Knox, TLS, Kerberos, and encryption zone management
- Expertise in software installation and configuration, orchestration, and automation with tools such as Jenkins/Ansible
- Improve the current estate by incorporating the use of centralized S3 data storage (VAST) throughout the platform processing stack
- 5+ years of experience engineering solutions in a Big Data on-prem or cloud environment

Requirements:
- Forward-thinking, independent, creative, and self-sufficient; able to work with limited documentation and with exposure to testing complex multi-tiered integrated applications
- Ability to work with minimal supervision, on own initiative, and on multiple tasks simultaneously
- Develop shell scripts, Linux utilities, and Linux commands within the Hadoop system management context
- Experience in monitoring and diagnosing Apache Spark jobs, including performance tuning and optimization for large-scale data processing
- Implement and manage CI/CD pipelines using Jenkins and Ansible to automate deployment processes and infrastructure provisioning
- Collaborate with Spark processing designers to build more efficient data processing at large/massive scale
- Exposure to Agile project methodology, with exposure to other methodologies as well (such as Kanban)
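As a toy illustration of the job-monitoring and diagnosis skills above, the following sketch scans job-completion logs for long-running jobs and surfaces them for investigation. The log format, field names, and job names are invented for illustration, not a real Spark or YARN log format:

```python
import re

def find_slow_jobs(log_lines, threshold_s=600):
    """Return (job, duration) pairs for successful jobs whose run time
    exceeded the threshold, parsed from hypothetical completion logs."""
    pattern = re.compile(r"job=(\S+)\s+status=(\w+)\s+duration=(\d+)s")
    slow = []
    for line in log_lines:
        m = pattern.search(line)
        if not m:
            continue  # skip lines that are not job-completion records
        job, status, duration = m.group(1), m.group(2), int(m.group(3))
        if status == "SUCCEEDED" and duration > threshold_s:
            slow.append((job, duration))
    return slow

logs = [
    "2024-05-01 02:14 job=daily_ingest status=SUCCEEDED duration=1250s",
    "2024-05-01 02:20 job=dedupe status=SUCCEEDED duration=95s",
    "2024-05-01 02:31 job=reconcile status=FAILED duration=30s",
]
print(find_slow_jobs(logs))  # [('daily_ingest', 1250)]
```

In practice the same check would be wired into scheduled monitoring and paired with SLA reporting, but the parse-filter-alert shape carries over.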
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Bengaluru
Work from Office
Job Title: Associate Data Engineer (Internship Program to Full-time Employee)

Job Description

For more than 80 years, Kaplan has been a trailblazer in education and professional advancement. We are a global company at the intersection of education and technology, focused on collaboration, innovation, and creativity to deliver a best-in-class educational experience and make Kaplan a great place to work. Our offices in India opened in Bengaluru in 2018. Since then, our team has fueled growth and innovation across the organization, impacting students worldwide. We are eager to grow and expand with skilled professionals like you who use their talent to build solutions, enable effective learning, and improve students' lives. The future of education is here, and we are eager to work alongside those who want to make a positive impact and inspire change in the world around them.

The Associate Data Engineer at Kaplan North America (KNA) within the Analytics division will work with world-class psychometricians, data scientists, and business analysts to forever change the face of education. This role is a hands-on technical expert who will help implement an Enterprise Data Warehouse powered by AWS RA3 as a key feature of our Lake House architecture. The perfect candidate possesses strong technical knowledge in data engineering, data observability, infrastructure automation, DataOps methodology, systems architecture, and development. You should be expert at designing, implementing, and operating stable, scalable, low-cost solutions to flow data from production systems into the data warehouse and into end-user-facing applications. You should be able to work with business customers in a fast-paced environment, understanding the business requirements and implementing data and reporting solutions.
Above all, you should be passionate about working with big data and someone who loves to bring datasets together to answer business questions and drive change.

Responsibilities:
- Design, implement, and deploy data solutions; solve difficult problems, generating positive feedback
- Build different types of data warehousing layers based on specific use cases
- Lead the design, implementation, and successful delivery of large-scale, critical, or difficult data solutions involving a significant amount of work
- Build scalable data infrastructure and understand distributed systems concepts from a data storage and compute perspective
- Utilize expertise in SQL and a strong understanding of ETL and data modeling
- Ensure the accuracy and availability of data to customers and understand how technical decisions can impact their business's analytics and reporting
- Be proficient in at least one scripting/programming language to handle large-volume data processing
- 30-day notice period preferred

Requirements:
- In-depth knowledge of the AWS stack (RA3, Redshift, Lambda, Glue, SNS)
- Experience in data modeling, ETL development, and data warehousing
- Effective troubleshooting and problem-solving skills
- Strong customer focus, ownership, urgency, and drive
- Excellent verbal and written communication skills and the ability to work well in a team

Preferred Qualifications:
- Proficiency with Airflow, Tableau & SSRS

Location: Bangalore, KA, India
Employee Type: Employee
Job Functional Area: Systems Administration/Engineering
Business Unit: 00091 Kaplan Higher ED

At Kaplan, we recognize the importance of attracting and retaining top talent to drive our success in a competitive market. Our salary structure and compensation philosophy reflect the value we place on the experience, education, and skills that our employees bring to the organization, taking into consideration labor market trends and total rewards.
All positions with Kaplan are paid at least $15 per hour or $31,200 per year for full-time positions. Additionally, certain positions are bonus- or commission-eligible. And we have a comprehensive benefits package; learn more about our benefits here.

Diversity & Inclusion Statement: Kaplan is committed to cultivating an inclusive workplace that values diversity, promotes equity, and integrates inclusivity into all aspects of our operations. We are an equal opportunity employer and all qualified applicants will receive consideration for employment regardless of age, race, creed, color, national origin, ancestry, marital status, sexual orientation, gender identity or expression, disability, veteran status, nationality, or sex. We believe that diversity strengthens our organization, fuels innovation, and improves our ability to serve our students, customers, and communities. Learn more about our culture here.

Kaplan considers qualified applicants for employment even if applicants have an arrest or conviction in their background check records. Kaplan complies with related background check regulations, including but not limited to the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. There are various positions where certain convictions may disqualify applicants, such as those positions requiring interaction with minors, financial records, or other sensitive and/or confidential information. Kaplan is a drug-free workplace and complies with applicable laws.
Posted 1 week ago
3.0 - 6.0 years
14 - 18 Lacs
Mumbai
Work from Office
Your Team Responsibilities

The Data Technology group in MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers reference, market, and other critical datapoints to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers. The position is based in our Mumbai, India office.

Your Key Responsibilities
- Build and maintain ETL pipelines for Snowflake
- Manage Snowflake objects and data models
- Integrate data from various sources
- Optimize performance and query efficiency
- Automate and schedule data workflows
- Ensure data quality and reliability
- Collaborate with cross-functional teams
- Document processes and data flows

Your Skills and Experience That Will Help You Excel
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with databases
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Experience managing Snowflake databases, schemas, tables, and other objects
- Proficient in Snowflake SQL, including CTEs, window functions, and stored procedures
- Familiar with Snowflake performance tuning and cost optimization tools
- Skilled in building ETL/ELT pipelines using dbt, Airflow, or Python
- Able to work with various data sources including RDBMS, APIs, and cloud storage
- Understanding of incremental loads, error handling, and scheduling best practices
- Strong SQL skills and intermediate Python proficiency for data processing
- Familiar with Git for version control and collaboration
- Basic knowledge of Azure or GCP cloud platforms
- Capable of integrating Snowflake with APIs and cloud-native services

About MSCI

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing
- Flexible working arrangements, advanced technology, and collaborative workspaces
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum

At MSCI we are passionate about what we do, and we are inspired by our purpose to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability Assistance@msci and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers.
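The incremental-load practice mentioned among the Snowflake skills above is commonly implemented with a watermark: each run copies only source rows newer than the last successfully loaded timestamp. Here is a minimal sketch using Python's built-in sqlite3 rather than Snowflake, with an invented schema:

```python
import sqlite3

def incremental_load(conn, watermark):
    """Watermark-based incremental load sketch: copy only source rows whose
    updated_at is newer than the watermark, then return the new watermark.
    (Illustrative schema; a real pipeline would persist the watermark and
    handle retries and late-arriving data.)"""
    rows = conn.execute(
        "SELECT id, updated_at FROM source WHERE updated_at > ?", (watermark,)
    ).fetchall()
    conn.executemany("INSERT OR REPLACE INTO target VALUES (?, ?)", rows)
    conn.commit()
    # New watermark = max timestamp seen this run, or the old one if no rows arrived
    return max((r[1] for r in rows), default=watermark)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source (id INTEGER PRIMARY KEY, updated_at TEXT);
    CREATE TABLE target (id INTEGER PRIMARY KEY, updated_at TEXT);
    INSERT INTO source VALUES (1, '2024-01-01'), (2, '2024-02-01'), (3, '2024-03-01');
""")
wm = incremental_load(conn, '2024-01-15')  # loads rows 2 and 3 only
print(wm, conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2024-03-01 2
```

ISO-8601 date strings sort lexicographically, which is why plain string comparison works as the watermark predicate here.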
Posted 1 week ago
1.0 - 3.0 years
1 - 5 Lacs
Nagar
Work from Office
At Davies North America, we're at the forefront of innovation and excellence, blending cutting-edge technology with top-tier professional services. As a vital part of the global Davies Group, we help businesses navigate risk, optimize operations, and spearhead transformation in the insurance and regulated sectors. We're on the lookout for an Indexer to join our growing team. As an Indexer, you will organize and make accessible large volumes of documents by indexing them to the correct category, facilitating quick and accurate retrieval of information.
Posted 1 week ago