
18519 Tuning Jobs - Page 37

JobPe aggregates results for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Google BigQuery, Microsoft SQL Server, GitHub, Google Cloud Data Services Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes. Project Role : Analytics and Modeler Project Role Description : Analyze and model client, market and key performance data.
Use analytical tools and techniques to develop business insights and improve decision-making. Must have Skills : Google BigQuery Good to Have Skills : No Technology Specialization Job Requirements : Roles & Responsibilities: 1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX) 2: Proven track record of delivering data integration and data warehousing solutions 3: Strong SQL and hands-on experience (No FLEX) 4: Experience with data integration and migration projects 5: Proficient in the BigQuery SQL language (No FLEX) 6: Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes 7: Experience in cloud solutions, mainly data platform services; GCP certifications 8: Experience in shell scripting, Python (No FLEX), Oracle, SQL Technical Experience : Professional & Technical Skills: 1: Expert in Python (No FLEX). Strong hands-on knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists and trees; experience with pytest and code coverage is preferred 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX) 3: Proficiency with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA and Confluence 4: Open mindset and ability to quickly adopt new technologies 5: Performance tuning of BigQuery SQL scripts 6: GCP certification preferred 7: Experience working in an agile environment Professional Attributes : 1: Must have good communication skills 2: Must have the ability to collaborate with different teams and suggest solutions 3: Ability to work independently with little supervision or as part of a team 4: Good analytical and problem-solving skills 5: Good team-handling skills Educational Qualification: 15 years of full time education Additional Information : Candidate should be ready for Shift B and work as an individual contributor
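The posting above stresses Python with Pandas/NumPy plus pytest and code-coverage skills. As a minimal illustrative sketch (all column names and cleaning rules here are hypothetical examples, not from the posting), a typical pipeline transform pairs a small, testable function with assertions:

```python
# Minimal illustration of a Pandas-based ETL transform of the kind the
# posting describes. Column names and rules are hypothetical examples.
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Normalize column names, drop rows missing an order id,
    and coerce the amount column to a numeric dtype."""
    df = raw.copy()
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["order_id"])
    # Non-numeric amounts become NaN, then 0.0
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df

raw = pd.DataFrame({
    "Order_ID": ["A1", None, "A3"],
    "Amount": ["10.5", "7", "oops"],
})
clean = clean_orders(raw)
print(len(clean), clean["amount"].tolist())
```

In practice such a function would be exercised by a pytest suite, which is where the "code coverage" expectation in the posting comes in.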

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP Basis Administration Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in continuous learning to stay updated with the latest technologies and best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP Basis Administration. - Strong understanding of system configuration and performance tuning. - Experience with database management and optimization techniques. - Familiarity with application lifecycle management tools. - Ability to troubleshoot and resolve technical issues efficiently. Additional Information: - The candidate should have minimum 3 years of experience in SAP Basis Administration. - This position is based at our Chennai office. - A 15 years full time education is required.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP Basis Administration Good to have skills : NA Minimum 2 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in problem-solving discussions and contribute to the overall success of the projects you are involved in. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in continuous learning to stay updated with industry trends and technologies. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP Basis Administration. - Good To Have Skills: Experience with SAP HANA and SAP NetWeaver. - Strong understanding of system performance tuning and optimization. - Experience in managing user access and security configurations. - Familiarity with backup and recovery processes in SAP environments. Additional Information: - The candidate should have minimum 2 years of experience in SAP Basis Administration. - This position is based at our Pune office. - A 15 years full time education is required.

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience. Roles & Responsibilities: The Offshore Data Engineer plays a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. Working closely with onshore data architects and analysts, this role ensures high data quality, performance, and reliability across distributed systems. The engineer is expected to demonstrate technical proficiency, proactive problem-solving, and strong collaboration in a remote environment. -Design and develop robust ETL/ELT pipelines to ingest, transform, and load data from diverse sources. -Collaborate with onshore teams to understand business requirements and translate them into scalable data solutions. -Optimize data workflows through automation, parallel processing, and performance tuning. -Maintain and enhance data infrastructure including data lakes, data warehouses, and cloud platforms (AWS, Azure, GCP). -Ensure data integrity and consistency through validation, monitoring, and exception handling. 
- Contribute to data modeling efforts for both transactional and analytical use cases. - Deliver clean, well-documented datasets for reporting, analytics, and machine learning. - Proactively identify opportunities for cost optimization, governance, and process automation. Professional & Technical Skills: - Programming & Scripting: Proficiency in Databricks with SQL and Python for data manipulation and pipeline development. - Big Data Technologies: Experience with Spark, Hadoop, or similar distributed processing frameworks. - Workflow Orchestration: Hands-on experience with Airflow or equivalent scheduling tools. - Cloud Platforms: Strong working knowledge of cloud-native services (AWS Glue, Azure Data Factory, GCP Dataflow). - Data Modeling: Ability to design normalized and denormalized schemas for various use cases. - ETL/ELT Development: Proven experience in building scalable and maintainable data pipelines. - Monitoring & Validation: Familiarity with data quality frameworks and exception handling mechanisms. Good To Have Skills: - DevOps & CI/CD: Exposure to containerization (Docker), version control (Git), and deployment pipelines. - Data Governance: Understanding of metadata management, lineage tracking, and compliance standards. - Visualization Tools: Basic knowledge of BI tools like Power BI, Tableau, or Looker. - Machine Learning Support: Experience preparing datasets for ML models and feature engineering. Additional Information: - The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Chennai office. - A 15 years full time education is required.
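The responsibilities above call for ensuring data integrity through validation, monitoring, and exception handling. A minimal pure-Python sketch of that pattern (field names and rules are hypothetical, not from the posting): good rows continue down the pipeline while failing rows are quarantined with a reason attached.

```python
# Sketch of a row-level validation / exception-handling step: rows that
# pass every rule flow on; failing rows are quarantined with the list of
# rules they broke. Field names and rules are hypothetical examples.
from typing import Any

RULES = {
    "customer_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows: list[dict[str, Any]]):
    good, quarantined = [], []
    for row in rows:
        errors = [field for field, ok in RULES.items() if not ok(row.get(field))]
        if errors:
            quarantined.append({**row, "_errors": errors})
        else:
            good.append(row)
    return good, quarantined

good, bad = validate([
    {"customer_id": "C1", "amount": 12.0},
    {"customer_id": "", "amount": -5},
])
print(len(good), bad[0]["_errors"])
```

In a Databricks or Airflow pipeline the quarantined rows would typically be written to a separate table and surfaced by monitoring, which is the "exception handling mechanism" the posting refers to.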

Posted 6 days ago

Apply

7.5 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Syniti ADM for SAP Good to have skills : NA Educational Qualification : 15 years full time education is required. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must To Have Skills: Proficiency in Syniti ADM for SAP. - Strong understanding of application development methodologies. - Experience with integration techniques and data migration processes. - Ability to troubleshoot and resolve application issues effectively. - Familiarity with performance tuning and optimization strategies. Additional Information: - The candidate should have minimum 7.5 years of experience in Syniti ADM for SAP. - This position is based at our Ahmedabad office. - A 15 years full time education is required.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Data Science (GenAI & Prompt engineering) – Bangalore Business Analytics Analyst 2 About CITI Citi's mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. We have 200 years of experience helping our clients meet the world's toughest challenges and embrace its greatest opportunities. Analytics and Information Management (AIM) Citi AIM was established in 2003, and is located across multiple cities in India – Bengaluru, Chennai, Pune and Mumbai. It is a global community that objectively connects and analyzes information, to create actionable intelligence for our business leaders. It identifies fact-based opportunities for revenue growth in partnership with the businesses. The function balances customer needs, business strategy, and profit objectives using best-in-class and relevant analytic methodologies. What do we do? The North America Consumer Bank – Data Science and Modeling team analyzes millions of prospects and billions of customer level transactions using big data tools and machine learning, AI techniques to unlock opportunities for our clients in meeting their financial needs and create economic value for the bank. The team extracts relevant insights, identifies business opportunities, converts business problems into modeling frameworks, uses big data tools and the latest deep learning and machine learning algorithms to build predictive models, implements solutions and designs go-to-market strategies for a huge variety of business problems. Role Description The role will be Business Analytics Analyst 2 in the Data Science and Modeling team of the North America Consumer Bank. The role will report to the AVP/VP leading the team. What do we offer: The Next Gen Analytics (NGA) team is a part of the Analytics & Information Management (AIM) unit.
The NGA modeling team will focus on the following areas of work: Role Expectations: Client Obsession – Create client-centric analytic solutions to business problems. Individuals should be able to have a holistic view of multiple businesses and develop analytic solutions accordingly. Analytic Project Execution – Own and deliver multiple and complex analytic projects. This requires an understanding of the business context, conversion of business problems into models, and implementation of such solutions to create economic value. Domain expert – Individuals are expected to be domain experts in their sub-field, as well as have a holistic view of other business lines to create better solutions. Key fields of focus are new customer acquisition, existing customer management, customer retention, product development, pricing and payment optimization and digital journey. Modeling and Tech Savvy – Always up to date with the latest use cases in the modeling community, machine learning and deep learning algorithms, and share knowledge within the team. Statistical mindset – Proficiency in basic statistics, hypothesis testing, segmentation and predictive modeling. Communication skills – Ability to translate and articulate technical thoughts and ideas to a larger audience, including influencing skills with peers and senior management. Strong project management skills. Ability to coach and mentor juniors. Contribute to organizational initiatives in wide-ranging areas including competency development, training, organizational building activities etc. Role Responsibilities: Work with large and complex datasets using a variety of tools (Python, PySpark, SQL, Hive, etc.) and frameworks to build deep learning/generative AI solutions for various business requirements. Primary focus areas include model training/fine-tuning, model validation, model deployment, and model governance related to multiple portfolios.
Design, fine-tune and implement LLM/GenAI applications using techniques like prompt engineering, Retrieval Augmented Generation (RAG) and model fine-tuning. Responsible for documenting data requirements, data collection/processing/cleaning, and exploratory data analysis, including utilizing deep learning/generative AI algorithms and data visualization techniques. Incumbents in this role may often be referred to as Data Scientists. Specialization in marketing, risk, digital, and AML fields is possible, applying deep learning & generative AI models to innovate in these domains. Collaborate with team members and business partners to build model-driven solutions using cutting-edge Generative AI models (e.g., Large Language Models) and also, at times, ML/traditional methods (XGBoost, Linear, Logistic, Segmentation, etc.). Work with model governance & fair lending teams to ensure compliance of models in accordance with Citi standards. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. What do we look for: If you are a bright and talented individual looking for a career in AI and Machine Learning with a focus on Generative AI, Citi has amazing opportunities for you. Bachelor’s Degree with at least 3 years of experience in data analytics, or Master’s Degree with 2 years of experience in data analytics, or PhD. Technical Skills Hands-on experience in PySpark/Python/R programming along with strong experience in SQL.
2-4 years of experience working on deep learning and generative AI applications Experience working on Transformers/LLMs (OpenAI, Claude, Gemini, etc.), prompt engineering, RAG-based architectures and relevant tools/frameworks such as TensorFlow, PyTorch, Hugging Face Transformers, LangChain, LlamaIndex, etc. Solid understanding of deep learning and transformers/language models. Familiarity with vector databases and fine-tuning techniques Experience working with large and multiple datasets and data warehouses, and ability to pull data using relevant programs and coding. Strong background in statistical analysis. Capability to validate/maintain deployed models in production Self-motivated and able to implement innovative solutions at a fast pace Experience in Credit Cards and Retail Banking is preferred Competencies Strong communication skills Multiple stakeholder management Strong analytical and problem-solving skills Excellent written and oral communication skills Strong team player Control orientation and risk awareness Working experience in a quantitative field Willing to learn and can-do attitude Ability to build partnerships with cross-functional leaders Education: Bachelor’s/Master’s degree in Economics/Statistics/Mathematics/Information Technology/Computer Applications/Engineering etc. from a premier institute Other Details Employment: Full Time Industry: Credit Cards, Retail Banking, Financial Services, Banking ------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Specialized Analytics (Data Science/Computational Statistics) ------------------------------------------------------ Time Type: ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above.
------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
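The Citi role above centers on Retrieval Augmented Generation. The retrieval half of RAG can be sketched without any LLM at all: represent documents and the query as vectors, rank by cosine similarity, and pass the top hits to the model as context. The document texts and hand-made "embeddings" below are purely illustrative; a real system would use an embedding model and a vector database.

```python
# Toy sketch of RAG retrieval: rank documents by cosine similarity to a
# query vector and return the top-k texts to stitch into the LLM prompt.
# Vectors here are hand-made toy values, not real embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical document store: (text, embedding) pairs.
docs = [
    ("card late-fee policy", [0.9, 0.1, 0.0]),
    ("mortgage rates overview", [0.0, 0.2, 0.9]),
    ("disputed transaction steps", [0.8, 0.3, 0.1]),
]

def retrieve(query_vec, k=2):
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# A query vector "about card fees" lands near the card-related documents.
context = retrieve([1.0, 0.2, 0.0])
print(context)
```

Prompt engineering then amounts to formatting `context` into the model's input, and fine-tuning adjusts the model itself; retrieval quality is usually the first thing tuned in practice.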

Posted 6 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

About the Company Creospan is a growing tech collective of makers, shakers, and problem solvers, offering solutions today that will propel businesses into a better tomorrow. “Tomorrow’s ideas, built today!” In addition to being able to work alongside equally brilliant and motivated developers, our consultants appreciate the opportunity to learn and apply new skills and methodologies to different clients and industries. Job Title: Data Modeler Location: Pune (Pan-India relocation is considerable - high preference is Pune) Hybrid: 3 days WFO & 2 days WFH Shift timings: UK Working Hours (9 AM – 5 PM GMT) Notice period: Immediate Gap: Up to 3 months (strictly not more than that) Project Overview: Creation and management of business data models in all their forms, including conceptual models, logical data models and physical data models (relational database designs, message models and others). Expert-level understanding of relational database concepts, dimensional database concepts and database architecture and design, ontology and taxonomy design. Background working with key data domains such as account, holding and transactions within the security servicing or asset management space. Expertise in designing data-driven solutions on Snowflake for complex business needs. Knowledge of the entire application lifecycle including Design, Development, Deployment, Operation and Maintenance in an Agile and DevOps culture. Role: This person strengthens the impact of, and provides recommendations on, data models and architecture that will need to be available and shared consistently across the TA organization through the identification, definition and analysis of how data-related assets aid business outcomes. The Data Modeler/Architect is responsible for making data trusted, understood and easy to use. They will be responsible for the entire lifecycle of the data architectural assets, from design and development to deployment, operation and maintenance, with a focus on automation and quality.
Must Have Skills: 10+ years of experience in Enterprise-level Data Architecture, Data Modelling, and Database Engineering Expertise in OLAP & OLTP design, Data Warehouse solutions, ELT/ETL processes Proficiency in data modelling concepts and practices such as normalization, denormalization, and dimensional modelling (Star Schema, Snowflake Schema, Data Vault, Medallion Data Lake) Experience with Snowflake-specific features, including clustering, partitioning, and schema design best practices Proficiency in Enterprise Modelling tools - Erwin, PowerDesigner, IBM InfoSphere etc. Strong experience in Microsoft Azure data pipelines (Data Factory, Synapse, SQL DB, Cosmos DB, Databricks) Familiarity with Snowflake’s native tools and services including Snowflake Data Sharing, Snowflake Streams & Tasks, and Snowflake Secure Data Sharing Strong knowledge of SQL performance tuning, query optimization, and indexing strategies Strong verbal and written communication skills for collaborating with both technical teams and business stakeholders Working knowledge of BIAN, ACORD, ESG risk data integration Nice to Haves: At least 3 years of experience in security servicing or asset management/investment is highly desired Understanding of the software development life cycle including planning, development, quality assurance, change management and release management Strong problem-solving skills and ability to troubleshoot complex issues Excellent communication and collaboration skills to work effectively in a team environment Self-motivated with the ability to work independently with minimal supervision Excellent communication skills: experience in communicating with tech and non-tech teams Deep understanding of data and information architecture, especially in the asset management space Familiarity with MDM, data vault, and data warehouse design and implementation techniques Business domain, data/content and process understanding (which are more important than technical skills).
Being techno-functional is a plus Good presentation skills in creating Data Architecture diagrams Data modelling and information classification expertise at the project and enterprise level Understanding of common information architecture frameworks and information models Experience with distributed data and analytics platforms in cloud and hybrid environments. Also an understanding of a variety of data access and analytic approaches (for example, microservices and event-based architectures) Knowledge of problem analysis, structured analysis and design, and programming techniques (Python, R)
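The posting above asks for dimensional modelling (star schema). The core mechanic is replacing natural keys in a fact table with surrogate keys that reference dimension tables, and it can be sketched in a few lines. All table and field names here are hypothetical illustrations, not a client schema.

```python
# Sketch of star-schema loading: assign surrogate keys to dimensions,
# then rewrite fact rows to reference dimensions by surrogate key.
# Names ("account", "instrument") are hypothetical examples only.

def build_dimension(values):
    """Map each distinct natural key to a surrogate integer key,
    preserving first-seen order."""
    return {v: i + 1 for i, v in enumerate(dict.fromkeys(values))}

raw_trades = [
    {"account": "ACC-7", "instrument": "BOND-1", "qty": 100},
    {"account": "ACC-7", "instrument": "EQ-9", "qty": 50},
    {"account": "ACC-2", "instrument": "BOND-1", "qty": 25},
]

dim_account = build_dimension(r["account"] for r in raw_trades)
dim_instrument = build_dimension(r["instrument"] for r in raw_trades)

fact_trades = [
    {"account_sk": dim_account[r["account"]],
     "instrument_sk": dim_instrument[r["instrument"]],
     "qty": r["qty"]}
    for r in raw_trades
]
print(dim_account, fact_trades[2])
```

In Snowflake the same shape would be expressed as dimension and fact tables with the surrogate keys as join columns, which is what enables the clustering and query-optimization work the posting mentions.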

Posted 6 days ago

Apply

2.0 years

0 Lacs

Greater Chennai Area

Remote

About FLSmidth : For more than 135 years, FLSmidth has challenged conventions and explored opportunities. Across more than 50 countries, we are 12,000 employees who combine our unique process knowledge on projects, products and services to drive success. We are the market-leading supplier of engineering, equipment and service solutions to customers in the global mining and cement industries. About the Role : You will be part of FLSmidth’s Process Optimisation Team, which uses FLSmidth’s ECS/ProcessExpert® APC Toolbox [available with advanced controlling techniques like Model Predictive Control (MPC), Fuzzy Logic and Soft Sensors] to create online applications to improve the customer’s manufacturing processes. This involves extensive travelling and requires a high level of technical expertise and customer handling skills. Qualification: Any Degree in Instrumentation/Chemical/Electronics Engineering. Experience: 2 - 8 years Job Responsibility : You will be involved in conducting process analysis to estimate the potential for improvement through ECS/ProcessExpert® APC Toolbox implementation. You will be responsible for designing, developing, testing and deploying APC solutions as per the identified requirements of various equipment-based process applications. You will be responsible for establishing, identifying, and implementing process control strategies/philosophies as required. You will be involved in establishing communication between the existing control system and the ECS/ProcessExpert® APC Toolbox system through OPC (DA/UA) or directly through a PLC interface. You will be involved in conducting step tests and model identification. You will be involved in fine-tuning controllers in MPC/Fuzzy/PID and implementing innovative technical ideas with process knowledge to achieve performance guarantees to increase productivity and decrease fuel and power consumption.
You will be involved in providing training to the customer and preparing and providing a site-specific manual to ensure continuous operation of the ECS/ProcessExpert® APC Toolbox. You will be involved in carrying out analytical control measures and analysis to establish the improvement in efficiency of plant operation after implementation of the ECS/ProcessExpert® APC Toolbox and preparing a detailed report and case story. You will be responsible for maintaining professional customer relationships and providing remote support after commissioning to maintain the health of the ECS/ProcessExpert® APC Toolbox. You will be involved in evaluating new developments in advanced process control technology and interacting with the development team on software improvements. You will cover all aspects of the technical implementation in APC projects including supporting hardware purchase, software installation, systems integration, engineering, commissioning, coordination with the PM, managing customer needs and completing deliverables on schedule. Your role is predominantly a project operations role; however, you may work across the full project lifecycle, including: Sales Support Solution Development Project Execution Post-Implementation Analysis It involves visits to sites for implementation purposes. Required knowledge/competencies: Some manufacturing experience in the process industries. Knowledge of cement plant and minerals processes will be an added advantage. Strong interest in automation, process control and advanced model-predictive applications. Experience with industrial computing, networking, OPC and ODBC/SQL data communications. Experience with DCS and/or PLC configuration, especially in the context of integration with external applications. Exposure to advanced process control implementation. Must be willing and able to travel 40% to 70% of the time. International assignments are likely. Self-motivated with excellent interpersonal, communication and organizational skills.
Knowledge of Python, Machine learning algorithms and deep learning will be an added advantage. Should be a team player.
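The role above involves fine-tuning MPC/Fuzzy/PID loops. As a minimal illustration of the simplest of those, here is a discrete PID controller driving a toy plant toward a setpoint; the gains and the trivial plant model are hypothetical examples, not FLSmidth's toolbox.

```python
# Minimal discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt.
# Gains and the toy plant below are illustrative, not from any real site.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (
            (error - self.prev_error) / self.dt
        )
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a trivial plant (output adds the control signal directly)
# toward a setpoint of 100 over 50 control steps.
pid = PID(kp=0.5, ki=0.1, kd=0.05, dt=1.0)
value = 20.0
for _ in range(50):
    value += pid.update(100.0, value)
print(round(value, 1))
```

Step testing and model identification, mentioned in the posting, are how the gains (or an MPC model) would be chosen for a real process rather than picked by hand as here.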

Posted 6 days ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Who is Forcepoint? Forcepoint simplifies security for global businesses and governments. Forcepoint’s all-in-one, truly cloud-native platform makes it easy to adopt Zero Trust and prevent the theft or loss of sensitive data and intellectual property no matter where people are working. 20+ years in business. 2.7k employees. 150 countries. 11k+ customers. 300+ patents. If our mission excites you, you’re in the right place; we want you to bring your own energy to help us create a safer world. All we’re missing is you! What You Will Be Doing Helping drive innovation through improvements to the CI/CD pipeline Appropriately and efficiently automating through tools such as Terraform, Ansible and Python Supporting teams who use our tools such as Engineering, Professional Services and Sales Engineering Running our infrastructure as code via git and webhooks Building a strong DevOps culture while also fostering strong collaboration with all areas of development, product, and QA Documenting all the things! Mentoring junior team members and fostering a collaborative, process-mature team Being passionate about championing new tools, processes and helping adoption across our organization Championing DevSecOps best practices across the team and organization Responsibilities Include Develop and support automated, scalable solutions to deploy and manage our global infrastructure. Implement effective CI/CD processes and pipelines. Build integrations between services to create fully automated processes. Maintaining and improving the functionality of automation tools for infrastructure provisioning, configuration, and deployment. Identify opportunities to optimize current solutions and perform hands-on troubleshooting of problems related to systems and performance in Prod/QA/Dev environments.
Preferred Skills BS in Computer Science or similar degree and 5+ years of related experience, or equivalent work experience Experience working with automated server configuration and deployment tools Strong working knowledge of AWS, certifications preferred Proficiency working in Linux environments, particularly with customer-facing systems Knowledge of IP networking, VPNs, DNS, load balancing, and firewalling Strong working knowledge of Infrastructure as Code (Terraform/CloudFormation) Nginx, Apache, HAProxy Object-oriented programming experience in a language such as Java, Python or C++ Deployment and support of containers such as Docker, Kubernetes GitHub, Jenkins, Artifactory Minimum Qualifications Strong practical Linux-based systems administration skills in a Cloud or Virtualized environment. Scripting (Bash/Python) Automation (Ansible/Chef/Puppet) CI/CD Tools, primarily Jenkins Familiarity with Continuous Integration and development pipeline processes. AWS, Google Cloud Compute, Azure (at least one of) Prior success in automating a real-world production environment. Experience in implementing monitoring tools and fine-tuning the metrics for optimal monitoring. Excellent written and oral communication skills; ability to communicate effectively with technical and non-technical staff. Don’t meet every single qualification? Studies show people are hesitant to apply if they don’t meet all requirements listed in a job posting. Forcepoint is focused on building an inclusive and diverse workplace – so if there is something slightly different about your previous experience, but it otherwise aligns and you’re excited about this role, we encourage you to apply. You could be a great candidate for this or other roles on our team.
The policy of Forcepoint is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to affirmatively seek to advance the principles of equal employment opportunity. Forcepoint is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by sending an email to recruiting@forcepoint.com. Applicants must have the right to work in the location to which you have applied.
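The configuration-automation duties above (Terraform, Ansible, infrastructure as code) largely come down to comparing desired state with actual state and planning the minimal set of changes, applied idempotently. A minimal, tool-agnostic sketch in Python: the service names and config structure are illustrative, not Forcepoint's stack.

```python
def plan_changes(desired, actual):
    """Compare desired vs. actual configuration and return the actions an
    idempotent configuration tool would take (illustrative only)."""
    actions = []
    for key, want in desired.items():
        have = actual.get(key)
        if have is None:
            actions.append(("create", key, want))
        elif have != want:
            actions.append(("update", key, want))
    # Anything present but not desired gets removed.
    for key in actual:
        if key not in desired:
            actions.append(("delete", key))
    return actions

desired = {"nginx": {"version": "1.24", "enabled": True},
           "haproxy": {"version": "2.8", "enabled": True}}
actual = {"nginx": {"version": "1.22", "enabled": True},
          "telnetd": {"enabled": True}}
print(plan_changes(desired, actual))
```

Running the plan a second time against the updated state would return an empty list, which is the idempotency property the posting's tooling relies on.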

Posted 6 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Google BigQuery, Microsoft SQL Server, GitHub, Google Cloud Data Services Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes. Project Role : Analytics and Modeler Project Role Description : Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making. Job Requirements : Roles & Responsibilities: 1: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX) 2: Proven track record of delivering data integration and data warehousing solutions 3: Strong hands-on SQL (No FLEX) 4: Experience with data integration and migration projects 5: Proficient in the BigQuery SQL dialect (No FLEX) 6: Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes 7: Experience in cloud solutions, mainly data platform services; GCP certifications preferred 8: Experience in shell scripting, Python (No FLEX), Oracle, SQL Professional & Technical Skills : 1: Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience in pytest and code coverage skills are preferred 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX) 3: Proficiency with tools to automate Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence 4: Open mindset and the ability to quickly adopt new technologies 5: Performance tuning of BigQuery SQL scripts 6: GCP certification preferred 7: Experience working in an agile environment Professional Attributes : 1: Good communication skills 2: Ability to collaborate with different teams and suggest solutions 3: Ability to work independently with little supervision, or as part of a team 4: Good analytical and problem-solving skills 5: Good team-handling skills Educational Qualification : 15 years of full-time education Additional Information : The candidate should be ready for Shift B and work as an individual contributor.
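One of the skills listed, performance tuning of BigQuery SQL, often comes down to letting the engine prune partitions: filtering on the partition column so only the needed date buckets are scanned. A toy Python model of the idea, using a dict of per-day row buckets as a stand-in for a partitioned table (illustrative only, not the BigQuery engine):

```python
from datetime import date

# Toy "partitioned table": one bucket of rows per day, mirroring how a
# date-partitioned table is laid out in storage.
table = {
    date(2024, 1, d): [{"day": date(2024, 1, d), "amount": d * 10}]
    for d in range(1, 31)
}

def scan(table, start, end):
    """Read only partitions inside [start, end], returning the rows and the
    number of partitions touched (a stand-in for bytes scanned)."""
    touched = [p for p in table if start <= p <= end]
    rows = [r for p in touched for r in table[p]]
    return rows, len(touched)

rows, touched = scan(table, date(2024, 1, 10), date(2024, 1, 12))
print(len(rows), touched)  # 3 rows from 3 partitions instead of all 30
```

A query whose WHERE clause cannot be mapped to the partition column (for example, one that wraps the column in a function) forces the equivalent of scanning all 30 buckets, which is exactly what tuning tries to avoid.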

Posted 6 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Project Role : Database Administrator Project Role Description : Design, implement and maintain databases. Install database management systems (DBMS). Develop procedures for day-to-day maintenance and problem resolution. Must have skills : Microsoft SQL Server Administration Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Database Administrator, you will be responsible for administering, developing, testing, and demonstrating databases. A typical day involves collaborating with various teams to design, implement, and maintain new databases, ensuring effective backup and recovery processes, and managing configuration settings. You will also install database management systems and contribute to the refinement of procedures and documentation for problem resolution and daily maintenance tasks, ensuring optimal database performance and reliability. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Assist in the design and implementation of database solutions that meet client needs. - Monitor database performance and troubleshoot issues to ensure high availability. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft SQL Server Administration. - Strong understanding of database design principles and best practices. - Experience with backup and recovery strategies for database systems. - Familiarity with performance tuning and optimization techniques. - Knowledge of security measures and compliance standards for database management. Additional Information: - The candidate should have minimum 3 years of experience in Microsoft SQL Server Administration. - This position is based at our Noida office. - A 15 years full time education is required.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Setting up LAMP environments and hosting and configuring PHP projects. Troubleshooting network, process, and disk-related issues. Installation and configuration of Samba and NFS. Setting up routers/modems and Linux firewalls (iptables). Configuring MySQL replication and clusters. Installation and configuration of a PXE server for installing Linux over the network. Technical writing skills, producing clear and unambiguous technical documentation and user stories. Writing shell scripts to automate tasks. Developing spiders using the Scrapy framework (Python) for crawling and scraping the web. Troubleshooting issues in PHP scripts and supporting web developers to develop websites. Experience of performance tuning on Apache server. Experience of testing server configuration and websites using different testing tools, e.g., ab and siege. Position: 01 Required Experience: 6 months to 2 years Technical Skills: Installation and configuration of Linux servers; troubleshooting network, process, and disk-related issues. Apply Now
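Routine tasks like the disk troubleshooting and shell-script automation above often start as a small scheduled check. A stdlib-only Python sketch of the kind of disk-usage alert a cron job would run; the threshold is illustrative:

```python
import shutil

def check_disk(path="/", threshold=90.0):
    """Return (percent_used, alert) for a mount point. The kind of check
    a cron'd shell script in this role would automate; threshold is an
    illustrative choice, not a standard value."""
    usage = shutil.disk_usage(path)  # named tuple: total, used, free
    pct = usage.used / usage.total * 100
    return round(pct, 1), pct >= threshold

pct, alert = check_disk("/")
print(f"/ is {pct}% full; alert={alert}")
```

In practice the alert branch would page someone or write to syslog; the same structure works for checking inode usage or individual mounts from /proc/mounts.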

Posted 6 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements Identify and analyze issues, make recommendations, and implement solutions Utilize knowledge of business processes, system processes, and industry standards to solve complex issues Analyze information and make evaluative judgements to recommend solutions and improvements Conduct testing and debugging, utilize script tools, and write basic code for design specifications Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 2-4 Years of Software development experience- Analyzing and extracting data from Oracle databases using SQL and PL/SQL scripts. Support the development and execution of data migration jobs from Oracle to Snowflake. 
Write and optimize SQL queries for data transformation, cleansing, and validation. Work with ETL tools such as Apache Spark, Python, and Ab Initio. Collaborate with senior developers to ensure data quality, integrity, and reconciliation. Participate in testing phases, including data validation and post-migration QA. Document data mapping, transformation logic, and workflows. Support performance tuning and troubleshooting of migration jobs. Participate in code reviews and follow established development practices. Support application deployments using CI/CD tools. Contribute to documentation and system specifications. Collaborate in Agile rituals, code reviews, and team demos. Test-driven development and automated testing tools like JUnit, Cucumber/Jasmine; JIRA, Gradle, Sonar. Working with cloud platforms for deployment and AI-based engineering tools. Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.
If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
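The post-migration validation and reconciliation duties above typically compare row counts and order-independent checksums between the source (Oracle) and target (Snowflake) extracts. A hedged sketch of one such fingerprint scheme; the approach is illustrative, not Citi's tooling:

```python
import hashlib

def table_fingerprint(rows, key_cols):
    """Order-independent fingerprint of an extract: (row count, XOR of
    per-row hashes). Because XOR is commutative, source and target can be
    compared without sorting either side. Illustrative reconciliation
    technique, not a specific vendor tool."""
    acc = 0
    for row in rows:
        canon = "|".join(str(row[c]) for c in key_cols)
        digest = hashlib.sha256(canon.encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return len(rows), acc

oracle_rows = [{"id": 1, "amt": "10.00"}, {"id": 2, "amt": "7.50"}]
snowflake_rows = [{"id": 2, "amt": "7.50"}, {"id": 1, "amt": "10.00"}]
print(table_fingerprint(oracle_rows, ["id", "amt"]) ==
      table_fingerprint(snowflake_rows, ["id", "amt"]))  # True: order does not matter
```

In a real migration the canonical string would also normalize types (dates, trailing zeros) so that representation differences between the two databases do not produce false mismatches.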

Posted 6 days ago

Apply

6.0 years

0 Lacs

India

Remote

Remote in Bangalore, India or Hyderabad, India Salary Range: ₹4.21 - 20 LPA Job Description Insight Global is looking for a Full Stack Developer to work on-site in Bangalore, India for a global technology leader known for driving innovation in networking, cybersecurity, and cloud solutions. This company has a strong presence in India and is recognized for its cutting-edge technology, employee development and inclusive culture. You'll be part of a team that plays a critical role in the enablement of their product compliance teams. Key responsibilities include: - Work across the stack, utilizing React for front-end development and Node.js, Python, and Go for back-end development. - Design and implement efficient, scalable database schemas in PostgreSQL. - Develop dynamic and interactive dashboards using modern front-end frameworks (React + Material UI) and integrate data-driven features from back-end systems. - Implement automation solutions by integrating AI/ML models for performance tuning, data processing, and workflow automation. - Manage code versions and ensure collaborative coding practices using Git. - Troubleshoot and resolve issues with both cloud infrastructure and application code. Optimize code for performance, scalability, and maintainability. - Work closely with front-end and back-end developers, as well as data scientists and UX/UI designers to create seamless user experiences. - Create detailed technical documentation and runbooks for deployment, monitoring, and troubleshooting procedures. Required Skills & Experience - 4–6 years of experience in cloud engineering, application development, and automation. - Proficiency in JavaScript (React, Node.js), Python, Go, and familiarity with PostgreSQL. - Strong experience with Material UI, CSS, and creating intuitive, user-friendly dashboards. - Hands-on experience with cloud environments (AWS, Azure). - Experience integrating AI/ML models into cloud-based applications for automation purposes.
- Strong experience with Git, versioning tools, and collaboration in a multi-developer environment. - Ability to troubleshoot complex issues in both cloud infrastructure and software applications.

Posted 6 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Overview: We are looking for an experienced AI Engineer to design, build, and optimize AI-powered applications, leveraging both traditional machine learning and large language models (LLMs). The ideal candidate will have a strong foundation in LLM fine-tuning, inference optimization, backend development, and MLOps, with the ability to deploy scalable AI systems in production environments. ShyftLabs is a leading data and AI company, helping enterprises unlock value through AI-driven products and solutions. We specialize in data platforms, machine learning models, and AI-powered automation, offering consulting, prototyping, solution delivery, and platform scaling. Our Fortune 500 clients rely on us to transform their data into actionable insights.
Key Responsibilities: Design and implement traditional ML and LLM-based systems and applications. Optimize model inference for performance and cost-efficiency. Fine-tune foundation models using methods like LoRA, QLoRA, and adapter layers. Develop and apply prompt engineering strategies including few-shot learning, chain-of-thought, and RAG. Build robust backend infrastructure to support AI-driven applications. Implement and manage MLOps pipelines for full AI lifecycle management. Design systems for continuous monitoring and evaluation of ML and LLM models. Create automated testing frameworks to ensure model quality and performance.
Basic Qualifications: Bachelor’s degree in Computer Science, AI, Data Science, or a related field. 4+ years of experience in AI/ML engineering, software development, or data-driven solutions.
LLM Expertise: Experience with parameter-efficient fine-tuning (LoRA, QLoRA, adapter layers). Understanding of inference optimization techniques: quantization, pruning, caching, and serving. Skilled in prompt engineering and design, including RAG techniques. Familiarity with AI evaluation frameworks and metrics. Experience designing automated evaluation and continuous monitoring systems.
Backend Engineering: Strong proficiency in Python and frameworks like FastAPI or Flask. Experience building RESTful APIs and real-time systems. Knowledge of vector databases and traditional databases. Hands-on experience with cloud platforms (AWS, GCP, Azure) focusing on ML services.
MLOps & Infrastructure: Familiarity with model serving tools (vLLM, SGLang, TensorRT). Experience with Docker and Kubernetes for deploying ML workloads. Ability to build monitoring systems for performance tracking and alerting. Experience building evaluation systems using custom metrics and benchmarks. Proficient in CI/CD and automated deployment pipelines. Experience with orchestration tools like Airflow. Hands-on experience with LLM frameworks (Transformers, LangChain, LlamaIndex). Familiarity with LLM-specific monitoring tools and general ML monitoring systems. Experience with distributed training and inference on multi-GPU environments. Knowledge of model compression techniques like distillation and quantization. Experience deploying models for high-throughput, low-latency production use. Research background or strong awareness of the latest developments in LLMs.
Tools & Technologies We Use: Frameworks: PyTorch, TensorFlow, Hugging Face Transformers. Serving: vLLM, TensorRT-LLM, SGLang, OpenAI API. Infrastructure: Docker, Kubernetes, AWS, GCP. Databases: PostgreSQL, Redis, vector databases.
We are proud to offer a competitive salary alongside a strong healthcare insurance and benefits package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
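The RAG techniques this role calls for hinge on one core step: embedding the query and ranking stored chunks by cosine similarity before prompting the model. A self-contained sketch with toy 3-dimensional "embeddings"; a real system would use model-generated vectors and a vector database:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    """Rank stored chunks by similarity to the query embedding; this is
    the retrieval half of a RAG pipeline (toy in-memory index)."""
    scored = sorted(index, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in scored[:k]]

index = [
    {"text": "refund policy", "vec": [0.9, 0.1, 0.0]},
    {"text": "shipping times", "vec": [0.1, 0.9, 0.1]},
    {"text": "return window", "vec": [0.8, 0.2, 0.1]},
]
print(retrieve([1.0, 0.1, 0.0], index, k=2))
```

The retrieved chunks are then interpolated into the prompt as context, which is where the prompt-engineering strategies listed above (few-shot examples, chain-of-thought) come in.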

Posted 6 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🔍 We're Hiring: Mechatronics Engineer – Full-Time 📍 Location: Hyderabad (On-Site Only) | 🗓 Immediate Joiners Preferred At GenXReality, we're building India’s most affordable, high-performance VR headset — and we need a hands-on Mechatronics Engineer who can bring together mechanical, electronics, and software systems into one seamless product. 🔧 Role Overview As a Mechatronics Engineer, you will lead the integration of all physical components in our VR headset — from actuator mechanisms to sensors and embedded logic — ensuring perfect alignment between hardware functionality and user experience. 🛠️ Responsibilities Integrate mechanical structures, electronic circuits, and embedded software into one unified system Design and prototype head tracking modules, sensor assemblies, and electromechanical enclosures Work on precise actuator control, motor tuning, and haptic systems Collaborate with PCB, software, and product design teams for seamless cross-domain functionality Test and validate complete subsystems including power, motion, and control flows Optimize the headset’s form factor, alignment, durability, and thermal management Document system architecture, integration guides, and testing protocols ✅ What We're Looking For 2+ years of experience working in mechatronics or robotics systems Proven ability to integrate mechanical, electronics, and embedded software systems Experience with sensors (IMU, proximity), actuators, drivers, and microcontrollers (STM32, Arduino, or similar) Proficient in CAD tools (Fusion 360/SolidWorks) and basic PCB or breadboard-level electronics Working knowledge of C/C++ or Python for embedded device control (Bonus) Experience building wearables, VR/AR hardware, or compact motion systems Bachelor’s in Mechatronics, Robotics, Mechanical, ECE, or related field 💼 Position Details Role Type: Full-Time (On-Site) Location: Chengicherla, Hyderabad Relocation Support: PG assistance + travel reimbursement for out-of-state hires Working Hours: 9:30 AM – 5:00 PM, Monday to Saturday
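The "precise actuator control and motor tuning" responsibility usually means closing a feedback loop, most commonly with a PID controller. A minimal discrete PID driving a toy first-order motor model; the gains and plant constants are illustrative, not GenXReality hardware:

```python
class PID:
    """Discrete PID loop for actuator/motor control (illustrative gains)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt           # accumulate steady-state error
        deriv = (err - self.prev_err) / self.dt  # damp fast changes
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a crude first-order motor model (inertia plus friction) toward 100 rpm.
pid = PID(kp=0.6, ki=0.2, kd=0.05, dt=0.1)
speed = 0.0
for _ in range(200):
    u = pid.update(100.0, speed)
    speed += (u - 0.1 * speed) * 0.1
print(round(speed, 1))
```

Tuning in practice is iterating on kp/ki/kd against the real actuator's response (overshoot, settling time), often starting from a heuristic like Ziegler-Nichols.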

Posted 6 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation. At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD. SOC Analysts at AHEAD monitor customer environments and perform Incident Detection, Validation, and Incident Reporting. SOC Analysts are the frontline of the SOC and are customer-facing representatives. SOC Analysts are responsible for triaging events and incidents and reporting validated incidents to the customer for incident response. Incumbents will possess strong technical and analytical skills while providing accurate analysis of security-related problems. They have a well-rounded networking background and are responsible for performing troubleshooting of customer issues. This individual is user focused and works to resolve client needs in a timely manner. These needs may involve resolving hardware/software failures, investigating and responding to security threats, and making change requests to the security policy of company devices. The SOC Analyst is expected to monitor security feeds streaming from client servers, network devices, and end user workstations, and to operate and maintain network security equipment at client locations.
The Analyst is expected to be familiar with a wide range of security tools and understand basic security fundamentals. The Analyst will perform information security event analysis and must possess knowledge of operating systems, TCP/IP networking, network attacks, attack signatures, defense countermeasures, vulnerability management, and log analysis. Roles & Responsibilities Monitor and analyze network traffic and alerts Investigate intrusion attempts and perform in-depth analysis of exploits Provide network intrusion detection expertise to support timely and effective decision making of when to declare an incident Conduct proactive threat research Review security events that are populated in a Security Information and Event Management (SIEM) system Tuning of rules, filters, and policies for detection-related security technologies to improve accuracy and visibility Data mining of log sources to uncover and investigate anomalous activity, along with related items of interest Independently follow procedures to contain, analyze, and eradicate malicious activity Document all activities during an incident and provide leadership with status updates during the life cycle of the incident Incident management, response, and reporting Provide information regarding intrusion events, security incidents, and other threat indications and warning information to the client Track trends, statistics, and key figures for each assigned client Assist with the development of processes and procedures to improve incident response times, analysis of incident, and overall SOC functions Reporting Incident reports Security status reports Client-facing security meetings Position Requirements Incident handling/response experience Experience with Automation tools. Working knowledge of common operating systems (Windows, Linux, etc.) and basic endpoint security principles Understanding of and a strong desire to learn common security technologies (IDS, Firewall, SIEM, etc.) 
The ability to think creatively to find elegant solutions to complex problems Excellent verbal and written communication skills The desire to work both independently and collaboratively with a larger team A willingness to be challenged along with a strong appetite for learning 8-10 years of experience in Information Security, Incident Response, etc. (or related field) Hands-on experience with common security technologies (IDS, Firewall, SIEM, etc.) Knowledge of common security analysis tools & techniques Understanding of common security threats, attack vectors, vulnerabilities and exploits Knowledge of regular expressions Education Bachelor's degree in Computer Science, Information Security or related/equivalent educational or work experience One or more of the following certifications: CISSP, GCIA, Security+, CEH, ACSE Why AHEAD Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning. USA Employment Benefits Include Medical, Dental, and Vision Insurance 401(k) Paid company holidays Paid time off Paid parental and caregiver leave Plus more! See https://www.aheadbenefits.com/ for additional benefit details. The compensation range indicated in this posting reflects the On-Target Earnings (“OTE”) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate’s relevant experience, qualifications, and geographic location.
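The rule tuning and log analysis described above, plus the stated "knowledge of regular expressions", often start as a regex and a threshold in a SIEM detection. A minimal Python sketch of brute-force detection over sshd-style log lines; the log format and threshold are illustrative:

```python
import re
from collections import Counter

# Matches OpenSSH-style failed-login lines and captures the user and source IP.
FAILED = re.compile(
    r"Failed password for (?:invalid user )?(\w+) from (\d+\.\d+\.\d+\.\d+)"
)

def brute_force_sources(log_lines, threshold=3):
    """Count failed SSH logins per source IP and flag likely brute force.
    The kind of detection rule an analyst would write and then tune
    (threshold, allowlists) to cut false positives."""
    hits = Counter()
    for line in log_lines:
        m = FAILED.search(line)
        if m:
            hits[m.group(2)] += 1
    return {ip: n for ip, n in hits.items() if n >= threshold}

log = [
    "sshd[1]: Failed password for invalid user admin from 203.0.113.9 port 40112",
    "sshd[2]: Failed password for root from 203.0.113.9 port 40113",
    "sshd[3]: Failed password for root from 203.0.113.9 port 40114",
    "sshd[4]: Accepted password for alice from 198.51.100.7 port 22",
]
print(brute_force_sources(log))
```

Tuning this rule in production means adjusting the threshold, adding a time window, and allowlisting known scanners, which is exactly the iteration loop the responsibilities above describe.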

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. [Data Engineer] What You Will Do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
What We Expect Of You We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications and Experience: Master's or Bachelor's degree and 5 to 9 years of Computer Science, IT or related field experience Functional Skills: Must-Have Skills Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), workflow orchestration, performance tuning on big data processing Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools Excellent problem-solving skills and the ability to work with large, complex datasets Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA) Good-to-Have Skills: Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Knowledge of Python/R, Databricks, SageMaker, cloud data platforms Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients.
Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
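The emphasis above on data dictionaries and data quality can be illustrated as a small validation pass: each record is checked against a dictionary of field types and required flags before loading. The field names below are hypothetical, not Amgen's schema:

```python
# Hypothetical data dictionary for a lab-results feed (illustrative only).
DATA_DICTIONARY = {
    "patient_id": {"type": str, "required": True},
    "result":     {"type": float, "required": True},
    "unit":       {"type": str, "required": False},
}

def validate(record, dictionary=DATA_DICTIONARY):
    """Check one record against the data dictionary and return a list of
    violations; an empty list means the record is consistent."""
    problems = []
    for field, spec in dictionary.items():
        value = record.get(field)
        if value is None:
            if spec["required"]:
                problems.append(f"{field}: missing required field")
        elif not isinstance(value, spec["type"]):
            problems.append(f"{field}: expected {spec['type'].__name__}")
    return problems

print(validate({"patient_id": "P001", "result": 4.2}))
print(validate({"patient_id": "P002", "result": "high"}))
```

In a Spark pipeline the same rules would run as column expressions over the whole dataset, routing failing rows to a quarantine table for review.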

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About The Company Tata Communications redefines connectivity with innovation and intelligence. Driving the next level of intelligence powered by Cloud, Mobility, Internet of Things, Collaboration, Security, Media services and Network services, we at Tata Communications are envisaging a New World of Communications. Job Description Responsible for managing customer queries related to all services and solutions delivered, including diagnosing and resolving complex technical issues in respective areas of Network/SDWAN/Cloud & Security/Unified Collaboration/Mobility & IoT/other domains. The role acts as a conduit between customers and other teams such as engineering and architecture for any issue resolution. This is an operational role, responsible for delivering results that have a direct impact on day-to-day operations, and capable of instructing professional or technical staff and reviewing the quality of the work undertaken by these roles. Responsibilities Technical administration or troubleshooting to ensure the efficient functionality of the solution. Incident validation, incident analysis, solution recommendation. Assists with the development, revision, and maintenance of Standard Operating Procedures and Working Instructions. Act as a point of escalation for Level-1 customer service analysts. Coordinate with IT teams on escalations, tracking, performance issues, and outages. Prepare Monthly Executive Summary Reports for managed clients and continuously improve their content and presentation. Provide recommendations in tuning and optimization of systems, processes, procedures, and policies. Maintain an inventory of the procedures used by the operations team and regularly evaluate the procedures; add, remove, and update them as appropriate. Publish weekly and monthly reports on customer service operations activity.
Desired Skill Sets: Good knowledge of implementation, installation, integration, troubleshooting and overall functionality. Experience in troubleshooting platform-related issues, data backup, restoration and retention. Maintains awareness of the latest technologies in the domain.

Position Summary (Key Objectives/Purpose of the Job): Manager - Networks (CSO-RF SOC). This position is responsible for handling all RF link deliveries within TAT, including L2/L3 escalation.

Major Responsibilities: Ensure smooth and proper delivery of all RF links within TAT. Ensure set quality parameters during delivery of all RF links. Ensure set procedures are followed by the team while delivering all RF links. Support the team in troubleshooting any issues (non-reachability of WAN, LM latency, PDs in WAN response, throughput issues, etc.). Coordinate with the various stakeholders involved in end-to-end delivery of RF links. Ensure generation and publication of error-free MIS reports on schedule. Manage manpower across the ASOC ONNET and WAN Aggregator desks for smooth functioning of the team. Provide the training and guidance the team needs to manage the task at hand.

Key Performance Indicators: Meeting service assurance KPIs such as TR/RE/cost saving/NPS/quality/process compliance.

Qualifications (necessary requirements): Diploma or Graduate in Engineering, preferably Electronics and Communications Engineering. Good understanding of OSS/BSS tools such as M6 and Optimus. Working knowledge and hands-on experience of network fundamentals, routing protocols and switching. Good knowledge of IP addressing, routing and switching concepts. Good knowledge of RF fundamentals. Good knowledge of WiMAX 802.16d. Field experience with RADWIN and PTP products. Good knowledge of cellular networks and the aggregation of network media solutions. Good knowledge of advanced MS Excel/MS PowerPoint for MIS processing. Excellent vendor coordination and follow-up skills.
Leadership and Behavioral: Leadership qualities required. Should be willing to accept challenges and be highly dynamic in nature. Good aptitude for learning new technologies and solutions. Systematic approach to issue resolution. Should have excellent written and verbal communication skills. Good analytical, diagnostic and problem-solving skills; customer centricity; dealing with ambiguity and pressure. Should be able to lead a team of 20-30 people.

Posted 6 days ago

Apply

7.0 years

0 Lacs

India

On-site

Job Description: 7+ years of relevant PostgreSQL DBA experience. Good knowledge of PostgreSQL database architecture. Installing and configuring PostgreSQL from source code, RPM, or one-click installers, on-premises and on cloud platforms. Strong experience handling logical and physical database backups. Hands-on experience with PostgreSQL restoration techniques such as pg_restore and point-in-time recovery (PITR). Expertise in applying database patches from lower to higher versions. Major and minor upgrades of PostgreSQL databases on multiple platforms (pg_dump/pg_restore and pg_upgrade). Implemented PostgreSQL DR servers and PostgreSQL load balancing using pgpool. Expertise in streaming replication (including cascading replication). Configuring pgbouncer for connection pooling. Configuring pgBadger to generate statistics reports from the PostgreSQL log file. Handling database corruption issues. Oracle-to-PostgreSQL migrations for multiple clients. Strong knowledge of PostgreSQL configuration parameters for tuning database health. Managing users and tablespaces on PostgreSQL servers. Configuring heterogeneous database connections between Oracle and PostgreSQL. Expertise in shell scripts for online backup and maintenance activities. Strong experience setting up RDS and Aurora clusters. Strong experience in query tuning and performance improvement. Tuning parameter groups and configuring Aurora clusters for read replicas. Configuring RDS and Aurora PostgreSQL logs to push to an S3 bucket. Experience reviewing performance metrics and query tuning on Aurora PostgreSQL. Configured Query Store and Analytics workspace on Microsoft Azure.

About Us: Datavail is a leading provider of data management, application development, analytics, and cloud services, with more than 1,000 professionals helping clients build and manage applications and data via a world-class tech-enabled delivery platform and software solutions across all leading technologies.
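The point-in-time recovery (PITR) skill listed above hinges on one rule: restore the newest base backup that finished at or before the recovery target, then replay WAL up to that target. A minimal sketch of that selection logic (the backup labels and timestamps are invented for the example):

```python
from datetime import datetime

def pick_base_backup(backups, target_time):
    """Return the most recent base backup taken at or before target_time.

    PITR restores the chosen base backup, then replays WAL up to
    recovery_target_time; a backup finished after the target is unusable.
    """
    candidates = [b for b in backups if b["finished"] <= target_time]
    if not candidates:
        raise ValueError("no base backup precedes the recovery target")
    return max(candidates, key=lambda b: b["finished"])

backups = [
    {"label": "base_mon", "finished": datetime(2024, 6, 3, 1, 0)},
    {"label": "base_wed", "finished": datetime(2024, 6, 5, 1, 0)},
    {"label": "base_fri", "finished": datetime(2024, 6, 7, 1, 0)},
]
# Recovery target, e.g. just before an accidental DROP TABLE on Thursday:
target = datetime(2024, 6, 6, 14, 30)
print(pick_base_backup(backups, target)["label"])  # base_wed
```

In practice the timestamps come from the backup labels (or a tool such as pgBackRest), and `recovery_target_time` is set in the restored cluster's configuration.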
For more than 17 years, Datavail has worked with thousands of companies spanning different industries and sizes, and is an AWS Advanced Tier Consulting Partner, a Microsoft Solutions Partner for Data & AI and Digital & App Innovation (Azure), an Oracle Partner, and a MySQL Partner.

About The Team: Datavail’s team of PostgreSQL experts can save you time and money. We have extensive experience with just about everything in PostgreSQL. Our consultants, architects, DBAs, and database development team have extensive hands-on experience in traditional on-premises PostgreSQL and in the cloud. Our experts have an average of 15 years of experience. They’ve overcome every obstacle in helping clients manage everything from databases, analytics, reporting, migrations, and upgrades to monitoring and overall data management. You can free up your IT resources to focus on growing your business rather than fighting fires. Our PostgreSQL experts can guide you through strategic initiatives or support routine database management.

Datavail’s Comprehensive PostgreSQL Database Services: Datavail offers PostgreSQL consulting services that allow you to take advantage of all the features of the PostgreSQL database. By providing high-availability solutions, building high-end database systems, architectural support, and managed service support, our team can ensure optimal performance for your database on-premises or in the cloud.

PostgreSQL Database Managed Services: Datavail’s business focuses on helping you use your data to drive business results through cost-saving services. The success of your business depends on how well you understand and manage your data. Our managed cloud services give you the power to unleash your organization’s potential. We provide comprehensive and technically advanced support for PostgreSQL installations to ensure that your databases are safe, secure, and managed with the utmost level of care. Our delivery performance in data management leads the industry.
We offer highly trained PostgreSQL database administrators via a 24x7, always-on, always-available global delivery model. The combination of a proven delivery model and top-notch experience ensures that Datavail will remain the database experts on demand you desire. Datavail’s flexible and client-focused services always add value to your organization. Are you a seasoned PostgreSQL Database Administrator? Does working in a multi-customer, multi-domain environment on a global scale motivate you? If yes, this role could be a good fit for you. Datavail is seeking a highly skilled, self-motivated, and passionate PostgreSQL DBA to join our PostgreSQL Global Practice. As a PostgreSQL DBA, you will be working on the latest open-source technologies in the industry. This position is based out of our global delivery centers in Mumbai, Hyderabad, and Bangalore in India. Datavail is one of the largest data-focused services companies in North America and provides both professional and managed services and expertise in Database Management, Application Development and Management, Cloud & Infrastructure Management, Packaged Applications and BI/Analytics. Why should you work at Datavail? Learn from a vast pool of global PostgreSQL DBAs with over 500 combined years of industry experience. You would be working with multiple customers in multi-domain environments. Work range: on-premises core PostgreSQL databases, AWS RDS & Aurora, and Azure PostgreSQL. Your certifications would be on us. Future opportunity for permanent deputation on an H1B visa to work in the US. Leverage the DV Training program to upskill your technical abilities.

Posted 6 days ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Amgen: Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member assisting in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimates on technical implementation.

What We Expect From You: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Bachelor’s degree and 2 to 4 years of Computer Science, IT or related field experience, OR Diploma and 4 to 7 years of Computer Science, IT or related field experience.

Preferred Qualifications (Functional Skills), Must-Have Skills: Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, Spark SQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning of big data processing. Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools. Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores.
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development.

Good-to-Have Skills: Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores). Understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms. Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.

Professional Certifications: AWS Certified Data Engineer preferred. Databricks certification preferred.

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting.

As an Associate Data Engineer at Amgen, you will be involved in the development and maintenance of data infrastructure and solutions. You will collaborate with a team of data engineers to design and implement data pipelines, perform data analysis, and ensure data quality. Your strong technical skills, problem-solving abilities, and attention to detail will contribute to the effective management and utilization of data for insights and decision-making.
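As a rough illustration of the ETL and data-quality responsibilities described above, here is a minimal extract-transform-load sketch in plain Python. The field names and the in-memory sink are invented for the example; a real pipeline would read from and write to actual source and warehouse systems:

```python
def extract(rows):
    # Stand-in for reading from a source system (e.g. S3, an API, a database).
    return list(rows)

def transform(rows):
    # Normalise field names/types; a simple quality gate drops bad records.
    cleaned = []
    for r in rows:
        if r.get("patient_id") is None:   # reject incomplete records
            continue
        cleaned.append({"patient_id": str(r["patient_id"]).strip(),
                        "value": float(r["value"])})
    return cleaned

def load(rows, sink):
    # Stand-in for a warehouse write; returns the number of rows loaded.
    sink.extend(rows)
    return len(rows)

sink = []
raw = [{"patient_id": " 42 ", "value": "1.5"},
       {"patient_id": None,   "value": "2.0"}]   # fails the quality gate
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)  # 1 [{'patient_id': '42', 'value': 1.5}]
```

In production the same extract/transform/load split maps onto Spark jobs or Databricks notebooks, with the quality gate expressed as schema and constraint checks rather than inline Python.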

Posted 6 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About AiSensy: AiSensy is a WhatsApp-based marketing and engagement platform helping businesses like Adani, Delhi Transport Corporation, Yakult, Godrej, Aditya Birla Hindalco, Wipro, Asian Paints, India Today Group, Skullcandy, Vivo, Physicswallah, and Cosco grow their revenues via WhatsApp. Enabling 100,000+ businesses with WhatsApp engagement and marketing. 400 crores+ WhatsApp messages exchanged between businesses and users via AiSensy per year. Working with top brands like Delhi Transport Corporation, Vivo, Physicswallah and more. High impact, as businesses drive 25-80% of revenues using the AiSensy platform. Mission-driven, growth-stage startup backed by Marsshot.vc, Bluelotus.vc and 50+ angel investors.

Role Overview: We are looking for a Senior Machine Learning Engineer to lead the development and deployment of cutting-edge AI/ML systems with a strong focus on LLMs, Retrieval-Augmented Generation (RAG), AI agents, and intelligent automation. You will work closely with cross-functional teams to translate business needs into AI solutions, bringing your expertise in building scalable ML infrastructure, deploying models in production, and staying at the forefront of AI innovation.

Key Responsibilities:

AI & ML System Development: Design, develop, and optimize end-to-end ML models using LLMs, transformer architectures, and custom RAG frameworks. Fine-tune and evaluate generative and NLP models for business-specific applications such as conversational flows, auto-replies, and intelligent routing. Lead prompt engineering and build autonomous AI agents capable of executing multi-step reasoning.

Infrastructure, Deployment & MLOps: Architect and automate scalable training, validation, and deployment pipelines using tools like MLflow, SageMaker, or Vertex AI. Integrate ML models with APIs, databases (vector/graph), and production services, ensuring performance, reliability, and traceability.
Monitor model performance in real time, and implement A/B testing, drift detection, and re-training pipelines.

Data & Feature Engineering: Work with structured, unstructured, and semi-structured data (text, embeddings, chat history). Build and manage vector databases (e.g., Pinecone, Weaviate) and graph-based retrieval systems. Ensure high-quality data ingestion, feature pipelines, and scalable pre-processing workflows.

Team Collaboration & Technical Leadership: Collaborate with product managers, software engineers, and stakeholders to align AI roadmaps with product goals. Mentor junior engineers and establish best practices in experimentation, reproducibility, and deployment. Stay updated on the latest in AI/ML (LLMs, diffusion models, multi-modal learning), and drive innovation in applied use cases.

Required Qualifications: Bachelor’s/Master’s degree in Computer Science, Engineering, AI/ML, or a related field from a Tier 1 institution (IIT, NIT, IIIT or equivalent). 2+ years of experience building and deploying machine learning models in production. Expertise in Python and frameworks such as TensorFlow, PyTorch, Hugging Face, and scikit-learn. Hands-on experience with transformer models, LLMs, LangChain, LangGraph, the OpenAI API, or similar. Deep knowledge of machine learning algorithms, model evaluation, hyperparameter tuning, and optimization. Experience working with cloud platforms (AWS, GCP, or Azure) and MLOps tools (MLflow, Airflow, Kubernetes). Strong understanding of SQL, data engineering concepts, and working with large-scale datasets.

Preferred Qualifications: Experience with prompt tuning, agentic AI systems, or multi-modal learning. Familiarity with vector search systems (e.g., Pinecone, FAISS, Milvus) and knowledge graphs. Contributions to open-source AI/ML projects or publications in AI journals/conferences. Experience building conversational AI or smart assistants using WhatsApp or similar messaging APIs.

Why Join AiSensy?
Build AI that directly powers growth for 100,000+ businesses. Work on cutting-edge technologies like LLMs, RAG, and AI agents in real production environments. High ownership, fast iterations, and impact-focused work. Ready to build intelligent systems that redefine communication? Apply now and join the AI revolution at AiSensy.
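At the core of the RAG and vector-database work described above is nearest-neighbour retrieval over embeddings: rank stored documents by similarity to a query vector and return the top k. A minimal, dependency-free sketch of cosine-similarity retrieval (the documents and toy 3-dimensional vectors are invented; production systems would use real embedding models and a vector store such as Pinecone or FAISS):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    """Return the texts of the k documents most similar to the query vector."""
    ranked = sorted(index, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

index = [
    {"text": "refund policy",  "vec": [0.9, 0.1, 0.0]},
    {"text": "shipping times", "vec": [0.1, 0.9, 0.0]},
    {"text": "returns window", "vec": [0.8, 0.2, 0.1]},
]
print(retrieve([1.0, 0.0, 0.0], index, k=2))
# ['refund policy', 'returns window']
```

In a RAG pipeline the retrieved texts are then stuffed into the LLM prompt as context; dedicated vector stores replace the linear scan here with approximate nearest-neighbour indexes for scale.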

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting – Advanced PL/SQL – Technical Consultant (Manager)

The opportunity: We’re looking for a PL/SQL Technical Consultant (Manager) to join the leadership group of our Consulting Team. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of a new service offering.

Your Key Responsibilities: Support client needs in Oracle application development/support using Advanced PL/SQL. Be proactive with a solution-oriented mindset, ready to learn new technologies for client requirements. Working knowledge of a source code control system. Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams.

Skills And Attributes For Success: SQL concepts – GROUP BY, joins, surrogate keys, constraints, datatypes, indexes. Experience in designing and developing Oracle objects. Experience with SQL queries (scalar subqueries, inline views, outer joins, hints, and analytic functions). Experience creating PL/SQL packages, procedures, functions, triggers, views, collections and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plans and SQL trace files. Partitioning, hints and indexing strategy for optimal performance. Experience in direct-path and conventional loads using SQL*Loader. Good understanding of Oracle Database architecture. Prior consulting experience preferred.

To qualify for the role, you must have: a Computer Science degree or equivalent; 7 to 11 years of development experience; strong analytical and debugging skills; a good understanding of advanced PL/SQL and performance tuning; a positive, learning attitude; good communication; and strong analytical skills.

What Working At EY Offers: At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
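The explain-plan and indexing-strategy skills listed above can be illustrated in miniature. The sketch below uses Python's bundled SQLite rather than Oracle, purely to show the general workflow of inspecting a query plan before and after adding an index; the table and index names are invented, and Oracle's `EXPLAIN PLAN FOR` / `DBMS_XPLAN` output differs in detail:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_id INTEGER, amt REAL)")
con.executemany("INSERT INTO orders (cust_id, amt) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])

def plan(sql):
    """Return the query plan as one string (the detail column of each step)."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT amt FROM orders WHERE cust_id = 7"
before = plan(query)   # without an index: a full table scan
con.execute("CREATE INDEX ix_orders_cust ON orders(cust_id)")
after = plan(query)    # with the index: an index search on ix_orders_cust
print(before)
print(after)
```

The same discipline applies in Oracle: read the plan, decide whether partitioning, a hint, or an index change is warranted, and re-check the plan after the change.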

Posted 6 days ago

Apply


12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description - Senior Database Administrator Job Summary: We are seeking a highly skilled and proactive Senior Database Administrator to join our Data Engineering team. In this critical role, you will administer, optimize, and maintain our diverse data platforms across a hybrid environment, including on-premises SQL Server databases, Snowflake, and Azure Cloud services. You will be responsible for deploying migrations, ensuring server health, troubleshooting issues, applying service packs and hotfixes, managing and monitoring server activities, and performing routine database maintenance. Additionally, you will collaborate closely with developers by providing design guidance, best practices, and performance considerations for SQL Server and Snowflake technologies. Roles and Responsibilities: · Monitor and troubleshoot database performance using tools such as SolarWinds DPA, Dynatrace, and other vendor monitoring solutions. Identify and resolve processing, memory, I/O, space, and other resource contention issues in real time. · Plan, commission, and deploy database system changes in collaboration with platform and infrastructure teams. Manage server commissioning, maintenance, problem analysis, and capacity planning to support database growth. · Manage and monitor ETL processes using Bryteflow, SQL Server Change Tracking, SQL Server Replication, DBT, Tidal, and Airflow. · Ensure high availability, performance, sustainability, and security of over 100 SQL Server and Azure SQL instances across production and non-production environments. Provide Snowflake performance tuning and optimization recommendations. · Assist developers with query tuning, schema design, and performance improvements. · Respond promptly to backup/restore, security, change management, and deployment requests from Developers, QA teams, Business Analysts, and end users. · Automate routine processes, track issues, and maintain comprehensive documentation. 
· Provide 24x7 support for critical production systems, including after-hours maintenance and release deployments. · Design, implement, and support SQL Server technologies including Analysis Services (SSAS), Reporting Services (SSRS), Integration Services (SSIS), Transact-SQL scripting, and batch development. · Perform essential maintenance tasks such as backups/restores, index rebuilds, statistics updates, and monitoring of logs and database sizes. Required Qualifications: · Bachelor’s degree in Computer Science, Information Technology, or a related field. · 12+ years of hands-on experience as a Database Administrator with MS SQL Server (2016 and newer), Snowflake, and Azure SQL. · Proven expertise in Performance Tuning and Optimization (PTO) for complex database environments. · Demonstrated project management skills with the ability to lead initiatives and collaborate across teams. · Experience with version control and time management tools such as TFS and Git. Job Location: Hyderabad, Chennai. Candidates can share their updated resume at vishnu.gadila@cesltd.com
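As one example of the routine index maintenance mentioned above, a commonly cited SQL Server rule of thumb is to reorganize an index at 5-30% fragmentation and rebuild above 30%, skipping small indexes where the fragmentation is not worth acting on. A sketch of that decision logic (the index names and statistics are invented, as `sys.dm_db_index_physical_stats` might report them):

```python
def index_action(frag_pct, page_count, min_pages=1000):
    """Suggest index maintenance using the common SQL Server thresholds:
    reorganize between 5% and 30% fragmentation, rebuild above 30%,
    and do nothing for small or barely fragmented indexes."""
    if page_count < min_pages or frag_pct < 5.0:
        return "none"
    return "reorganize" if frag_pct <= 30.0 else "rebuild"

# Hypothetical fragmentation stats: (index name, fragmentation %, page count)
stats = [("ix_orders_cust", 42.0, 5000),
         ("ix_orders_date", 12.0, 8000),
         ("ix_tiny",        80.0,   40)]
for name, frag, pages in stats:
    print(name, index_action(frag, pages))  # rebuild / reorganize / none
```

In practice a DBA automates this as a scheduled job (e.g. with Ola Hallengren's maintenance scripts or a custom T-SQL procedure), with thresholds tuned to the workload.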

Posted 6 days ago

Apply