Home
Jobs

558 Composer Jobs - Page 7

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a job alert
JobPe aggregates listings to make them easy to find, but you apply directly on the original job portal.

4.0 years

0 Lacs

India

Remote

Company Description Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in over 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%. Job Description Miratech as a trusted partner seeks a CCAI BOT Developer to join our team remotely. This project focuses on developing and implementing advanced conversational AI solutions using the Google CCAI Bot framework. Scrum teams, including IVR and chatbot developers, collaborate to build intelligent voice bots and chatbots that enhance customer interactions in contact centers. The project integrates NLP, NLU, and machine learning technologies with backend systems, databases, and APIs to create scalable, high-performance solutions. It utilizes CI/CD pipelines, agile methodologies, and enterprise-scale technologies like Google Dialogflow, Genesys, and Nuance Mix Tools. Developers also work with REST-based microservices and automated testing to ensure reliability and continuous improvement of the chatbot ecosystem. Responsibilities: Design, develop, and deploy chatbots and voicebots using leading Conversational AI platforms such as Microsoft Bot Framework and Google Dialogflow. Write clean, efficient, and maintainable code following industry best practices and standards. Develop custom components and tools to enhance chatbot functionality, performance, and user experience. Collaborate with cross-functional teams, including developers, designers, and stakeholders, to align chatbot solutions with project goals and user needs. Utilize NLP and ML techniques, including TTS, STT, and SSML, to enable intelligent and context-aware chatbot interactions. Integrate chatbot systems with backend infrastructure, databases, and APIs to ensure seamless data flow and interaction. Troubleshoot and resolve technical issues by analyzing logs, debugging code, and implementing continuous improvements. Stay updated with emerging trends and advancements in chatbot development, AI, and Conversational UI technologies. Qualifications 4+ years of experience with the Google CCAI Bot framework, Dialogflow ES/CX, and Conversational AI technologies, including NLP, NLU, and ML. 4+ years of experience in IVR application development, including Nuance grammar development, GRAT, and GRE. Proficiency in Core Java, Java/J2EE, Servlets, JSP, and REST-based microservices. Expertise in web services integration, including working with SQL databases, relational databases, and RESTful APIs. Experience with Google, Genesys, and related technologies, including GVP, Nuance Mix Tools, and Genesys Composer. Hands-on experience with Git, Jenkins, Maven, and automated testing methodologies. Strong understanding of agile development and Scrum best practices. Strong analytical skills for resolving technical issues in complex, distributed environments. 
Experience with the Spring framework and familiarity with Tomcat or similar web servers. Bachelor’s degree in a technology-related field or equivalent professional experience.

We offer:
Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth.
Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, language courses, and a relocation program.
Work From Anywhere Culture: make the most of the flexibility that comes with remote work.
Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities.
Global Impact: collaborate on impactful projects for top global clients and shape the future of industries.
Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events.
Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality.

Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law.
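
The posting above centers on Dialogflow CX bot development and backend integration. As a rough illustration only, and not Miratech's actual implementation, the sketch below sends one user utterance to a Dialogflow CX agent with the google-cloud-dialogflow-cx Python client and prints the matched intent; the project, agent, and session IDs are placeholders.

```python
# Minimal sketch: send one utterance to a Dialogflow CX agent and read back
# the matched intent and fulfillment text. Project, agent, and session IDs
# are placeholders, not values from the posting. Regional agents would also
# need a regional API endpoint; this assumes a "global" agent.
from google.cloud import dialogflowcx_v3 as cx

def detect_intent(text: str, session_id: str = "demo-session") -> str:
    client = cx.SessionsClient()
    session = client.session_path(
        project="my-gcp-project",   # placeholder
        location="global",
        agent="my-agent-id",        # placeholder
        session=session_id,
    )
    query_input = cx.QueryInput(
        text=cx.TextInput(text=text),
        language_code="en",
    )
    response = client.detect_intent(
        request=cx.DetectIntentRequest(session=session, query_input=query_input)
    )
    result = response.query_result
    reply = " ".join(
        msg.text.text[0] for msg in result.response_messages if msg.text.text
    )
    print("Matched intent:", result.intent.display_name)
    return reply

if __name__ == "__main__":
    print(detect_intent("I want to check my order status"))
```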

Posted 1 week ago

Apply

2.0 - 5.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Area(s) of Responsibility / Skill Set
2 to 5 years’ experience in PTC Windchill and ThingWorx customization and configuration. Experienced in solution design, Windchill customization debugging, Windchill development fundamentals, documentation, software testing, software maintenance, and software performance tuning. Strong product development methodology and tools experience, including agile methods, source management, problem resolution, automated testing, DevOps, CI/CD, GitHub, SVN, etc.

Technical competences (required):
Windchill
- Skilled in basic and advanced Java, web services, JavaScript, shell scripting, SQL, HTML, CSS.
- Knowledge of Windchill implementation in the basic modules is a must.
- Very skilled in PTC Windchill PDMLink customization, XML, and database (SQL) programming.
- In-depth knowledge and good experience in Java, J2EE, JSP, and JavaScript.
- Good understanding of basic PLM processes such as BOM management, part management, document management, EBOM, and MBOM.
ThingWorx
- Skilled in basic and advanced Java, web services, JavaScript, shell scripting, REST, HTML, CSS.
- Experience across PTC’s complete ThingWorx suite, including but not limited to Navigate (10.x, 11.x and above), ThingWorx Foundation, ThingWorx Flow, etc.
- Knowledge of ThingWorx Composer for building ThingWorx applications.

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

- Bachelor’s degree in Computer Science, Information Technology, or a related field
- 2-4 years of hands-on experience with Laravel and Angular (academic projects/internships also considered)
- Good understanding of PHP, MySQL, and RESTful APIs
- Familiarity with Angular concepts like components, services, routing, and reactive forms
- Understanding of HTML, CSS, JavaScript, and TypeScript
- Knowledge of version control systems like Git
- Strong problem-solving and communication skills
- Eagerness to learn and adapt in a fast-paced environment
- Familiarity with tools like Postman, Composer, npm, or Webpack
- Build dynamic frontend interfaces and SPAs using Angular

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant, SFDC Developer (Experience Cloud)!

Responsibilities: Lead analysis, design, development, unit testing, and deployment on the Salesforce platform as part of an Agile cadence. Build custom solutions on the platform using LWC, Aura, Apex, and Visualforce. Implement standard Salesforce functionality, including sharing rules, roles, and profiles, and use standard features such as Workflow, Process Builder, and Flows to create solutions. Execute integration testing with Apex test classes and maintain maximum code coverage. Develop triggers, batch classes, and scheduled jobs while maintaining best coding standards. Write Apex using industry-standard best practices and design patterns. Maintain expert-level knowledge of Salesforce system architecture and development best practices to scale implementations. Develop complex integrations between Salesforce and other systems, either through custom APIs or middleware tools. Work on production incident debugging, analysis, bug fixes, service requests, data loads, and minor/major enhancements. Provide business support for critical issues and mentor operations team members.

Minimum qualifications / skills: BS/MS degree in Computer Science or a related technical field involving coding, or equivalent work/technical experience. Experience developing Experience Cloud, Sales Cloud, and Service Cloud instances for large enterprises across multiple geographies. Salesforce Platform Developer I & II certification. Knowing when to use declarative tools (Process Builder, Flow, and Workflow) versus programmatic methods, and extending the Lightning Platform using Apex and Lightning Web Components. Experience using Apex Data Loader.

Preferred qualifications / skills: Demonstrable experience designing, and personally building, Lightning pages for enhanced end-user experiences. Experience with Salesforce Sites and Communities. Experience integrating Salesforce with external systems (REST & SOAP APIs, JSON & XML, etc.). Knowledge of Salesforce platform best practices, coding and design guidelines, and governor limits. Excellent communication, documentation, and organizational skills and the ability to relentlessly prioritize. Passion for a fast-paced, high-growth environment. Good to have: experience with Conga, including Conga Composer for dynamic document generation, contract lifecycle, contract approval, and electronic signature via Conga Sign.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Get to know us at genpact.com and on LinkedIn , X , YouTube , and Facebook . Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training . Job Lead Consultant Primary Location India-Hyderabad Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jun 5, 2025, 4:30:51 AM Unposting Date Dec 2, 2025, 8:30:51 AM Master Skills List Digital Job Category Full Time
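
The role above is primarily on-platform work (Apex, LWC), but it also calls for integrating Salesforce with external systems over REST. Purely as an illustration from the external-system side, and assuming the community simple_salesforce Python client rather than anything Genpact actually uses, a query-and-update round trip looks roughly like this; the credentials and field values are placeholders.

```python
# Illustrative only: query and update Salesforce records from an external
# Python service via the REST API, using the community "simple_salesforce"
# client. Credentials and the Case status value below are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="integration.user@example.com",  # placeholder
    password="********",                      # placeholder
    security_token="********",                # placeholder
)

# Run a SOQL query over the REST API
open_cases = sf.query(
    "SELECT Id, Subject, Status FROM Case WHERE Status = 'New' LIMIT 5"
)

for record in open_cases["records"]:
    print(record["Id"], record["Subject"])
    # Update a single record by Id
    sf.Case.update(record["Id"], {"Status": "Working"})
```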

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are hiring for our client, a Hyderabad-based GCC that is yet to establish its presence in India.

Job Summary: We are looking for a Senior Data Engineer to join our growing team of analytics experts. As a data engineer, you will be responsible for designing and implementing our data pipeline architecture and optimizing data flow and collection for cross-functional groups, with scalability in mind. Data engineering is about building the underlying infrastructure, so being comfortable passing the limelight to someone else is essential.

Required Skills: Hands-on experience in data integration and data warehousing. Strong proficiency in Google BigQuery, Python, SQL, Airflow/Cloud Composer, and Ascend or any modern ETL tool. Experience with data quality frameworks or custom-built validations.

Preferred Skills: Knowledge of dbt for data transformation and modeling. Familiarity with Collibra for data cataloging and governance.

Qualifications: Advanced working SQL knowledge and experience with relational databases, plus working familiarity with a variety of other databases. Strong analytic skills related to working with unstructured datasets. Experience building a serverless data warehouse in GCP or AWS. 5+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.

Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Design, build, and optimize data pipelines using Google BigQuery, following best practices such as query optimization, partitioning, clustering, and scalable data modeling. Develop robust ETL/ELT processes using Python and SQL, with an emphasis on reliability, performance, and maintainability. Create and manage data flows in Ascend or an equivalent tool, including setting up read/write connectors for various data sources, implementing custom connectors in Python, and managing scheduling, failure notifications, and data services. Implement data quality checks (technical and business level) and participate in defining data testing strategies to ensure data reliability. Perform incremental loads and merge operations in BigQuery. Build and manage Airflow (Cloud Composer) DAGs, configure variables, and handle scheduling as part of orchestration. Work within a CI/CD (DevSecOps) setup to promote code efficiently across environments. Participate in technical solutioning: translate business integration needs into technical user stories, contribute to technical design documents, and provide accurate estimations. Conduct and participate in code reviews, enforce standards, and mentor junior engineers. Collaborate with QA and business teams during UAT; troubleshoot and resolve issues in development, staging, and production environments.
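
One of the responsibilities above is performing incremental loads and merge operations in BigQuery, orchestrated from Airflow on Cloud Composer. A minimal sketch of that pattern, with placeholder project, dataset, and table names, might look like the DAG below (the Google provider package is typically available on Cloud Composer).

```python
# Sketch of an incremental-load pattern on Cloud Composer: a daily Airflow DAG
# that MERGEs a staging table into a target BigQuery table. Project, dataset,
# and table names are placeholders, not the client's actual objects.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

MERGE_SQL = """
MERGE `my-project.analytics.orders` AS t
USING `my-project.staging.orders_{{ ds_nodash }}` AS s
ON t.order_id = s.order_id
WHEN MATCHED THEN
  UPDATE SET t.status = s.status, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, status, updated_at) VALUES (s.order_id, s.status, s.updated_at)
"""

with DAG(
    dag_id="orders_incremental_merge",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    merge_orders = BigQueryInsertJobOperator(
        task_id="merge_orders",
        # Run the MERGE as a standard-SQL query job in BigQuery
        configuration={"query": {"query": MERGE_SQL, "useLegacySql": False}},
    )
```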

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, and work on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions that support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation, and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices, and is responsible for repeatable, lean, and maintainable enterprise BI design across organizations. It partners effectively with the client team. Leadership is expected not only in the conventional sense but also within the team: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.

Responsibilities: Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional and technical documentation – e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration. Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes. Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level. Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications: 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc. 5-8 years of management experience required. 5-8 years of consulting experience preferred. Minimum of 5 years of data architecture, data modelling, or similar experience. Bachelor’s degree or equivalent experience; Master’s degree preferred. Strong data warehousing, OLTP systems, data integration, and SDLC background. Strong experience in orchestration and working experience with cloud-native / third-party ETL data load orchestration. Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.). Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP). Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or similar, with experience in CI/CD using one or more code management platforms. Strong Databricks experience required, including creating notebooks in PySpark. Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.). Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.). Strong experience in orchestration with working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar. Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data. 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience: Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc. Experience providing estimates for data integration projects, including testing, documentation, and implementation. Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate, and recommend alternative solutions. Ability to provide technical direction to other team members, including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results. Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM. Can create documentation and presentations that “stand on their own”. Can advise sales on the evaluation of data integration efforts for new or existing client work. Can contribute to internal/external data integration proofs of concept.
Demonstrates the ability to create new and innovative solutions to problems that have not been encountered previously. Ability to work independently on projects as well as collaborate effectively across teams. Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success. Strong team-building, interpersonal, analytical, and problem identification and resolution skills. Experience working with multi-level business communities. Can effectively utilise SQL and/or the available BI tool to validate and elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems and issues. Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships and rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.
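
Since the qualifications above call for building Databricks notebooks in PySpark, here is a small, generic sketch of that kind of notebook cell: read raw files, clean them, and write a partitioned Delta table. Paths and column names are illustrative placeholders, not anything from the posting.

```python
# Minimal PySpark sketch of typical Databricks notebook work: read raw CSVs,
# apply light cleansing, and write a partitioned Delta table. Paths and
# column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Read raw files with a header row
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)   # drop invalid amounts
       .dropDuplicates(["order_id"])                 # de-duplicate on the key
)

# Write a Delta table partitioned by date (Delta is available on Databricks)
(curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/curated/orders/"))
```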

Posted 1 week ago

Apply

100.0 years

1 - 3 Lacs

Gurgaon

On-site

Senior Data Engineer (GCP, Python) Gurgaon, India Information Technology 314204

Job Description

About The Role: Grade Level (for internal use): 10

S&P Global Mobility. The Role: Senior Data Engineer

Department overview: Automotive Insights at S&P Global Mobility leverages technology and data science to provide unique insights, forecasts, and advisory services spanning every major market and the entire automotive value chain—from product planning to marketing, sales, and the aftermarket. We provide the most comprehensive data spanning the entire automotive lifecycle—past, present, and future. With over 100 years of history, unmatched credentials, and a larger customer base than any other provider, we are the industry benchmark for clients around the world, helping them make informed decisions to capitalize on opportunity and avoid risk. Our solutions are used by nearly every major OEM, 90% of the top 100 tier-one suppliers, media agencies, governments, insurance companies, and financial stakeholders to provide actionable insights that enable better decisions and better results.

Position summary: S&P Global is seeking an experienced and driven Senior Data Engineer who is passionate about delivering high-value, high-impact solutions to the world’s most demanding, high-profile clients. The ideal candidate must have at least 5 years of experience developing and deploying data pipelines on Google Cloud Platform (GCP) and should be passionate about building high-quality, reusable pipelines using cutting-edge technologies. This role involves designing, building, and maintaining scalable data pipelines, optimizing workflows, and ensuring data integrity across multiple systems. The candidate will collaborate with data scientists, analysts, and software engineers to develop robust and efficient data solutions.

Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines. Optimize and automate data ingestion, transformation, and storage processes. Work with structured and unstructured data sources, ensuring data quality and consistency. Develop and maintain data models, warehouses, and databases. Collaborate with cross-functional teams to support data-driven decision-making. Ensure data security, privacy, and compliance with industry standards. Troubleshoot and resolve data-related issues in a timely manner. Monitor and improve system performance, reliability, and scalability. Stay up to date with emerging data technologies and recommend improvements to our data architecture and engineering practices.

What you will need: Strong programming skills in Python. 5+ years of experience in data engineering, ETL development, or a related role. Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases. Proficiency building data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, Cloud Composer, etc. Strong understanding of data modeling, data warehousing, and data governance principles. Capable of mentoring junior data engineers and assisting them with technical challenges. Familiarity with orchestration tools like Apache Airflow. Familiarity with containerization and orchestration. Experience with version control systems (Git) and CI/CD pipelines. Excellent problem-solving skills and ability to work in a fast-paced environment. Excellent communication skills. Hands-on experience with Snowflake is a plus.
Experience with big data technologies is a plus. Experience in AWS is a plus. Should be able to convert business queries into technical documentation. Education and Experience Bachelor’s degree in Computer Science, Information Systems, Information Technology, or a similar major or Certified Development Program 5+ years of experience building data pipelines using python & GCP (Google Cloud platform). About Company Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world’s leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you’ll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data to help our clients understand today’s market, reach more customers, and shape the future of automotive mobility. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. 
Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 314204 Posted On: 2025-05-30 Location: Gurgaon, Haryana, India
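
The role described above asks for pipelines built on GCP services such as Dataflow, BigQuery, and Cloud Composer. For orientation only, a minimal Apache Beam batch pipeline of the kind that runs on Dataflow could look like the sketch below; the bucket, project, table, and schema are placeholders, not S&P Global's actual objects.

```python
# Hedged sketch of a Dataflow-style batch pipeline in Apache Beam (Python):
# read CSV lines from GCS, parse them, and write rows to BigQuery.
# Bucket, project, and table names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    vehicle_id, region, units = line.split(",")
    return {"vehicle_id": vehicle_id, "region": region, "units": int(units)}

options = PipelineOptions(
    runner="DataflowRunner",          # use "DirectRunner" to test locally
    project="my-project",             # placeholder
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
    save_main_session=True,           # ship parse_line to the workers
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/sales/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:automotive.sales",
            schema="vehicle_id:STRING,region:STRING,units:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```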

Posted 1 week ago

Apply

2.0 years

0 - 0 Lacs

Noida

On-site

*Job Title:* Video Editor *Job Type:* Full-time/Part-time/Freelance *Location:* Noida Sector-62 *Job Description:* We're looking for a skilled Video Editor to join our team! As a Video Editor, you'll be responsible for editing video content for various projects, including promotional videos, social media clips, and more. If you have a keen eye for detail and a passion for storytelling, we'd love to hear from you! *Responsibilities:* - Edit video content to create engaging and visually appealing stories - Collaborate with the team to understand project requirements and goals - Work with raw footage to create a cohesive narrative - Add music, sound effects, and graphics to enhance the video - Ensure all edited content meets brand guidelines and quality standards - Stay up-to-date with industry trends and best practices *Requirements:* - 2+ years of experience in video editing - Proficiency in video editing software such as Adobe Premiere Pro, Final Cut Pro, or Avid Media Composer - Strong understanding of storytelling and visual aesthetics - Excellent attention to detail and organizational skills - Ability to work under tight deadlines and collaborate with a team *Nice to Have:* - Experience with motion graphics and animation - Knowledge of color grading and sound design - Familiarity with cloud-based collaboration tools *What We Offer:* - Competitive salary and benefits package - Opportunity to work on exciting projects with a talented team - Professional development and growth opportunities Job Types: Full-time, Freelance Pay: ₹10,000.00 - ₹25,000.00 per month Schedule: Day shift Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred) Experience: Work: 2 years (Preferred) Work Location: In person

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

Remote

Job Description Miratech as a trusted partner seeks a CCAI BOT Developer to join our team remotely. This project focuses on developing and implementing advanced conversational AI solutions using the Google CCAI Bot framework. Scrum teams, including IVR and chatbot developers, collaborate to build intelligent voice bots and chatbots that enhance customer interactions in contact centers. The project integrates NLP, NLU, and machine learning technologies with backend systems, databases, and APIs to create scalable, high-performance solutions. It utilizes CI/CD pipelines, agile methodologies, and enterprise-scale technologies like Google Dialogflow, Genesys, and Nuance Mix Tools. Developers also work with REST-based microservices and automated testing to ensure reliability and continuous improvement of the chatbot ecosystem. Responsibilities: Design, develop, and deploy chatbots and voicebots using leading Conversational AI platforms such as Microsoft Bot Framework and Google Dialogflow. Write clean, efficient, and maintainable code following industry best practices and standards. Develop custom components and tools to enhance chatbot functionality, performance, and user experience. Collaborate with cross-functional teams, including developers, designers, and stakeholders, to align chatbot solutions with project goals and user needs. Utilize NLP and ML techniques, including TTS, STT, and SSML, to enable intelligent and context-aware chatbot interactions. Integrate chatbot systems with backend infrastructure, databases, and APIs to ensure seamless data flow and interaction. Troubleshoot and resolve technical issues by analyzing logs, debugging code, and implementing continuous improvements. Stay updated with emerging trends and advancements in chatbot development, AI, and Conversational UI technologies. Qualifications 4+ years of experience with the Google CCAI Bot framework, Dialogflow ES/CX, and Conversational AI technologies, including NLP, NLU, and ML. 4+ years of experience in IVR application development, including Nuance grammar development, GRAT, and GRE. Proficiency in Core Java, Java/J2EE, Servlets, JSP, and REST-based microservices. Expertise in web services integration, including working with SQL databases, relational databases, and RESTful APIs. Experience with Google, Genesys, and related technologies, including GVP, Nuance Mix Tools, and Genesys Composer. Hands-on experience with Git, Jenkins, Maven, and automated testing methodologies. Strong understanding of agile development and Scrum best practices. Strong analytical skills for resolving technical issues in complex, distributed environments. Experience with the Spring framework and familiarity with Tomcat or similar web servers. Bachelor’s degree in a technology-related field or equivalent professional experience. Show more Show less
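
Among the responsibilities above is using TTS, STT, and SSML for voicebot interactions. As a hedged, stand-alone example rather than the project's own code, the snippet below renders an SSML prompt to audio with the Google Cloud Text-to-Speech client; the voice name is a placeholder.

```python
# Small sketch of the TTS/SSML piece of a voicebot: render an SSML prompt to
# audio with Google Cloud Text-to-Speech. The voice name is a placeholder.
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()

ssml = """
<speak>
  Your order <say-as interpret-as="characters">AB12</say-as> ships
  <emphasis level="moderate">tomorrow</emphasis>.
  <break time="300ms"/> Anything else I can help with?
</speak>
"""

response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(ssml=ssml),
    voice=texttospeech.VoiceSelectionParams(
        language_code="en-US", name="en-US-Neural2-C"  # placeholder voice
    ),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.LINEAR16  # WAV output
    ),
)

with open("prompt.wav", "wb") as out:
    out.write(response.audio_content)
```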

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hello folks! We’re hiring – Senior Data Engineer (GCP) || Hyderabad || Chennai || Bengaluru. We are hiring for a product-based company for permanent roles! Join our innovative team and contribute to cutting-edge solutions.

Job Title: Senior Data Engineer / Principal Data Engineer (GCP Data Engineer)

What you’ll be doing: We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise on our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within the company. This includes understanding the business requirements and converting them to technical design; working on data ingestion, preparation, and transformation; developing data streaming applications; debugging production failures and identifying solutions; working on ETL/ELT development; and understanding the DevOps process and contributing to DevOps pipelines.

What we’re looking for: You’re curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You’ll need to have: Bachelor’s degree or four or more years of work experience. Four or more years of work experience. Experience with data warehouse concepts and the data management life cycle. Experience with any DBMS. Experience in shell scripting, Spark, and Scala. Experience in GCP/BigQuery, Cloud Composer, and Airflow. Experience in real-time streaming.

Experience: 4+ years (mandatory). Location: Hyderabad, Bengaluru, Chennai. Work mode: Hybrid. Notice period: 60 days max. If interested, please share your CV at Ramanjaneya.m@technogenindia.com
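
The posting asks for Spark and real-time streaming experience on GCP. Purely as an illustrative sketch, and assuming Kafka as the event source (the ad does not name one), a PySpark Structured Streaming job could look like this; the broker, topic, schema, and paths are placeholders.

```python
# Hedged sketch of a real-time ingestion job: PySpark Structured Streaming
# reads JSON events from Kafka and appends them as Parquet files.
# Requires the spark-sql-kafka connector package on the Spark classpath.
# Broker addresses, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("network_events_stream").getOrCreate()

event_schema = StructType([
    StructField("cell_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder
    .option("subscribe", "network-metrics")               # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/streams/network_metrics")
    .option("checkpointLocation", "/data/checkpoints/network_metrics")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```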

Posted 1 week ago

Apply

50.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Your Team Responsibilities Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks/bigquery/Airflow/composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements. Your Key Responsibilities Ensure data quality and integrity through data validation and cleansing processes. Analyze existing SQL queries, functions, and stored procedures for performance improvements. Develop database routines like procedures, functions, and views/MV. Participate in data migration projects and understand technologies like Delta Lake/warehouse/bigquery. Debug and solve complex problems in data pipelines and processes. Your Skills And Experience That Will Help You Excel Bachelor’s degree in computer science, Engineering, or a related field. Strong understanding of distributed data processing platforms like Databricks and BigQuery. Proficiency in Python, PySpark, and SQL programming languages. Experience with performance optimization for large datasets. Strong debugging and problem-solving skills. Fundamental knowledge of cloud services, preferably Azure or GCP. Excellent communication and teamwork skills. Nice To Have Experience in data migration projects. Understanding of technologies like Delta Lake/warehouse. About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion belonging and connection, including eight Employee Resource Groups. All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. 
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com Show more Show less
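
The pipeline work described above involves PySpark/Databricks and Delta Lake. A minimal upsert ("merge") into a Delta table with the delta-spark DeltaTable API, assuming a Delta-enabled Spark session (for example, on Databricks) and placeholder paths and keys, might look like this sketch.

```python
# Sketch of an upsert into a Delta table with the delta-spark DeltaTable API,
# the kind of Delta Lake work referenced above. Paths and the join key are
# placeholders; the Spark session must have the Delta extensions enabled.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("positions_upsert").getOrCreate()

updates = spark.read.parquet("/mnt/staging/positions/")        # placeholder
target = DeltaTable.forPath(spark, "/mnt/curated/positions/")  # placeholder

(
    target.alias("t")
    .merge(updates.alias("s"), "t.position_id = s.position_id")
    .whenMatchedUpdateAll()      # update existing rows
    .whenNotMatchedInsertAll()   # insert new rows
    .execute()
)
```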

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience Level: 4 to 6 years of relevant IT experience Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: ● Design, develop, test, and maintain scalable ETL data pipelines using Python. ● Work extensively on Google Cloud Platform (GCP) services such as: ○ Dataflow for real-time and batch data processing ○ Cloud Functions for lightweight serverless compute ○ BigQuery for data warehousing and analytics ○ Cloud Composer for orchestration of data workflows (based on Apache Airflow) ○ Google Cloud Storage (GCS) for managing data at scale ○ IAM for access control and security ○ Cloud Run for containerized applications ● Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. ● Implement and enforce data quality checks, validation rules, and monitoring. ● Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. ● Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. ● Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. ● Document pipeline designs, data flow diagrams, and operational support procedures. Required Skills: ● 4–6 years of hands-on experience in Python for backend or data engineering projects. ● Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). ● Solid understanding of data pipeline architecture, data integration, and transformation techniques. ● Experience in working with version control systems like GitHub and knowledge of CI/CD practices. ● Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.). Good to Have (Optional Skills): ● Experience working with Snowflake cloud data platform. ● Hands-on knowledge of Databricks for big data processing and analytics. ● Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools. Additional Details: ● Excellent problem-solving and analytical skills. ● Strong communication skills and ability to collaborate in a team environment. Show more Show less
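
The responsibilities above include implementing data quality checks and validation rules on BigQuery tables. As a simple, hedged illustration (the table name and rules are placeholders), such checks can be expressed as validation queries run through the google-cloud-bigquery client:

```python
# Hedged example of simple data-quality checks: run validation queries against
# a BigQuery table and fail loudly if any rule is violated. The table name and
# the rules themselves are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
TABLE = "my-project.sales_dw.orders"  # placeholder

CHECKS = {
    "null_order_ids": f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE order_id IS NULL",
    "negative_amounts": f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE amount < 0",
    "duplicate_keys": f"""
        SELECT COUNT(*) AS bad FROM (
          SELECT order_id FROM `{TABLE}` GROUP BY order_id HAVING COUNT(*) > 1
        )""",
}

failures = []
for name, sql in CHECKS.items():
    bad_rows = list(client.query(sql).result())[0]["bad"]
    if bad_rows:
        failures.append(f"{name}: {bad_rows} offending rows")

if failures:
    raise ValueError("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed.")
```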

Posted 1 week ago

Apply

2.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

🚀 We're Hiring! | Drupal 10 & 11 Developer 📍 Location: On-site at Indian Institute of Space Science and Technology (IIST), Thiruvananthapuram 🕒 Employment Type: Full-Time 💰 Salary: ₹45,000 – ₹50,000/month (negotiable for the right candidate) About the Role: We’re looking for a talented and experienced Drupal 10 & 11 Developer to join our team for an exciting on-site opportunity at IIST Thiruvananthapuram . In this role, you'll take ownership of developing, upgrading, and maintaining high-performance Drupal-based websites and applications, working closely with a cross-functional team including designers, developers, and project managers. Key Responsibilities: 🔹 Develop and maintain Drupal 10 & 11 websites (including custom modules and themes) 🔹 Migrate existing Drupal sites (e.g., Drupal 9 → 10 or 10 → 11) 🔹 Create responsive and accessible front-end experiences 🔹 Optimize front-end code for performance and industry standards 🔹 Work with Drupal core APIs, hooks, Composer, Drush, and Git 🔹 Implement third-party APIs and ensure seamless integrations Qualifications & Experience: 🎓 Education: Bachelor’s/Diploma in Computer Science, IT, or MCA 💼 Experience: 2+ years in Drupal development (7/8/9/10); exposure to Drupal 11 is a strong plus 💡 Technical Skills: Strong knowledge of PHP, MySQL, HTML5, CSS3, JavaScript Solid grasp of Drupal’s hook system and configuration management Familiarity with modern development tools like Composer, Drush, Git Soft Skills: ✔ Strong problem-solving and analytical abilities ✔ Excellent communication and collaboration skills ✔ Ability to work independently and handle multiple tasks 📩 How to Apply: Send your updated resume (with detailed project descriptions) to hr@cygnusadvertising.in . Please mention “Drupal 10 & 11 Developer” in the subject line. Ready to build impactful digital experiences with us? Let’s connect! #DrupalDeveloper #DrupalJobs #HiringNow #WebDevelopment #IIST #ThiruvananthapuramJobs #TechCareers #Drupal10 #Drupal11 #Frontend #PHP #MySQL #DeveloperJobs Show more Show less

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

At NICE, we don’t limit our challenges. We challenge our limits. Always. We’re ambitious. We’re game changers. And we play to win. We set the highest standards and execute beyond them. And if you’re like us, we can offer you the ultimate career opportunity that will light a fire within you. What’s the role all about? Serve as one of the top-performing and most proficient engineers in designing, producing, and testing high-quality software that meets specified functional and non-functional requirements within the time and resource constraints given. How will you make an impact? Develop software feature(s) according to requirements specifications. Develop Innovative ideas to address complex issues or future functionality. Provide optimum solutions to complex problems by implementing best industry practices. Ensure the intended design and quality levels are met through regular code reviews and testing of the software in-development. Lead the end-to-end implementation and support of the software through leading by example to ensure complete quality coverage and high degrees of responsiveness to any issues that come up throughout the complete lifecycle of the software. Planning and performing unit testing to ensure fit to design/requirements and perform automation. Possess good communication and presentation skills. Provide L3 support for issues raised in Prod and lower environments. Have you got what it takes? Bachelor/Master of Engineering Degree in Computer Science, Electronic Engineering or equivalent from reputed institute. 2+ years of application programming experience. Extensive experience in PHP v 8.x, MySQL and willing to learn new technologies. Good to have knowledge in AWS services and tools (Elastic Search, Redis, RabbitMQ, Jenkins, S3, Doctrine, Slim). Experience in development using PHP frameworks like Symfony, Slim, CodeIgniter, Laravel. Development experience in JavaScript, TypeScript, Angular/ReactJS, will be added plus. Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP. Experience in building applications using tools like PHP Unit, Docker, Composer. In-depth experience in GIT commands and skillful usage of GIT source control. Well versed with CI/CD pipelines (Jenkins, Ansible, GitHub Actions) Good understanding of design patterns and experience in implementing the same (Factory, Adapter). Experience in developing with REST API, API Authorization & Micro services. Worked in high performance, highly available and scalable Enterprise applications. Strong knowledge of OOAD and Design patterns. Development experience building solutions that leverage SQL and NoSQL databases. Experience designing and developing scalable multi-tenant SaaS-based solutions. You will have an advantage if you also have: Bring a culture of Innovation to the job. Exposure to designing serverless applications in Cloud (AWS) Experience in building applications in Contact Center Domain. Experience in leveraging Gen AI services while providing solutions. Rasa What’s in it for you? Join an ever-growing, market disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! 
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7028 Reporting into: Tech Manager Role Type: Individual Contributor About NICE NICE Ltd. (NASDAQ: NICE) software products are used by 25,000+ global businesses, including 85 of the Fortune 100 corporations, to deliver extraordinary customer experiences, fight financial crime and ensure public safety. Every day, NICE software manages more than 120 million customer interactions and monitors 3+ billion financial transactions. Known as an innovation powerhouse that excels in AI, cloud and digital, NICE is consistently recognized as the market leader in its domains, with over 8,500 employees across 30+ countries. NICE is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, age, sex, marital status, ancestry, neurotype, physical or mental disability, veteran status, gender identity, sexual orientation or any other category protected by law. Show more Show less

Posted 1 week ago

Apply

4.0 - 5.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

About Sun Pharma: Sun Pharmaceutical Industries Ltd. (Sun Pharma) is the fourth largest specialty generic pharmaceutical company in the world with global revenues of US$ 5.4 billion. Supported by 43 manufacturing facilities, we provide high-quality, affordable medicines, trusted by healthcare professionals and patients, to more than 100 countries across the globe.

Job Summary: The EDMS Development and Configuration Specialist will be responsible for the successful development, deployment, configuration, and ongoing support of EDMS 21.2. This role requires a deep understanding of EDMS LSQM workflows, strong technical skills, and the ability to work closely with cross-functional teams to ensure the EDMS meets the needs of the organization.

Roles and Responsibilities:
• Assist in the development and maintenance of the Documentum D2 LSQM application, including custom workflows and document management solutions.
• Collaborate with senior developers to understand requirements and translate them into technical specifications.
• Support the testing and debugging of Documentum applications to ensure high-quality output and performance.
• Document development processes and maintain accurate technical documentation.
• Solid understanding of content management principles and best practices, with experience implementing Documentum solutions in enterprise environments.
• Familiarity with Java, SQL, and web services integration for developing Documentum applications.
• Expertise in the Documentum platform and its components, including Documentum Content Server and Documentum Webtop.
• Proficiency in development tools such as Documentum Composer and Documentum Administrator.
• Experience with version control systems (e.g., Git) and agile development methodologies.

Qualifications:
• Bachelor's degree in Information Technology or a related field.
• Minimum of 4-5 years of experience in EDMS LSQM configuration, preferably in a pharmaceutical or biotech environment.
• Strong understanding of Category 1, Category 2 & 3 workflows.
• Proficiency in Documentum LSQM software.
• Ability to manage multiple tasks and projects simultaneously.
• Strong analytical and problem-solving skills.
• Excellent communication and interpersonal skills.

Preferred Qualifications:
• Advanced degree in Information Technology or a related field.
• Experience with database management and DQL.
• Understanding of Documentum Content Server and its APIs.
• Familiarity with Documentum DQL (Documentum Query Language).
• Experience in Documentum development, including proficiency in Documentum Foundation Classes (DFC) and Documentum Query Language (DQL).
• Basic knowledge of RESTful services and web development principles.

Selection Process: Interested candidates must apply through the listing on Jigya; only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an online assessment and/or a technical screening interview administered by Jigya on behalf of Sun Pharma. Candidates selected after the screening rounds will be processed further by Sun Pharma.

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3,000 global team members. We also have offices in Canada and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Experience: 3-8 years
Location: Gurgaon & Bangalore

Job Description
You should have extensive production experience in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions.
- Strong experience in Big Data technologies - Hadoop, Sqoop, Hive and Spark, including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, Anthos.
- Ability to drive the deployment of the customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
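To illustrate the Cloud Composer and BigQuery work described in this listing, here is a minimal, hypothetical Airflow DAG sketch that loads daily CSV files from GCS into a BigQuery table; the bucket, dataset, and table names are placeholders.

```python
# Minimal Cloud Composer / Airflow DAG sketch: load a daily CSV drop from GCS
# into BigQuery. Bucket, dataset, and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",               # placeholder bucket
        source_objects=["orders/{{ ds }}/*.csv"],      # templated daily path
        destination_project_dataset_table="example-project.analytics.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```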

Posted 1 week ago

Apply

0.0 - 2.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

*Job Title:* Video Editor
*Job Type:* Full-time/Part-time/Freelance
*Location:* Noida Sector-62

*Job Description:* We're looking for a skilled Video Editor to join our team! As a Video Editor, you'll be responsible for editing video content for various projects, including promotional videos, social media clips, and more. If you have a keen eye for detail and a passion for storytelling, we'd love to hear from you!

*Responsibilities:*
- Edit video content to create engaging and visually appealing stories
- Collaborate with the team to understand project requirements and goals
- Work with raw footage to create a cohesive narrative
- Add music, sound effects, and graphics to enhance the video
- Ensure all edited content meets brand guidelines and quality standards
- Stay up-to-date with industry trends and best practices

*Requirements:*
- 2+ years of experience in video editing
- Proficiency in video editing software such as Adobe Premiere Pro, Final Cut Pro, or Avid Media Composer
- Strong understanding of storytelling and visual aesthetics
- Excellent attention to detail and organizational skills
- Ability to work under tight deadlines and collaborate with a team

*Nice to Have:*
- Experience with motion graphics and animation
- Knowledge of color grading and sound design
- Familiarity with cloud-based collaboration tools

*What We Offer:*
- Competitive salary and benefits package
- Opportunity to work on exciting projects with a talented team
- Professional development and growth opportunities

Job Types: Full-time, Freelance
Pay: ₹10,000.00 - ₹25,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Work: 2 years (Preferred)
Work Location: In person

Posted 1 week ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Summary
We are looking for a highly skilled and experienced Technical Team Leader with deep expertise in Laravel and PHP frameworks. The ideal candidate will have a minimum of 5 years of experience in web development and will be responsible for leading a team of developers, ensuring code quality, and driving the technical success of projects. Experience or exposure to mobile application development will be an added advantage.

Key Responsibilities
· Lead and manage a team of Laravel/PHP developers to deliver high-quality projects.
· Design and develop scalable web applications using the Laravel framework.
· Architect robust, secure, and scalable PHP-based applications.
· Conduct code reviews, mentor team members, and enforce best practices in development.
· Collaborate with project managers, designers, and other teams to ensure smooth project delivery.
· Manage project timelines, risks, and resource planning.
· Troubleshoot and debug complex technical issues.
· Stay up-to-date with Laravel and PHP trends, tools, and practices.
· Ensure documentation and technical specifications are maintained.
· Make informed decisions in the best interest of project execution and work independently when needed.

Required Qualifications
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· Minimum 5 years of experience in PHP and Laravel development.
· Strong understanding of OOP principles, MVC architecture, and RESTful API development.
· Proficient in MySQL, HTML, CSS, JavaScript, and modern front-end frameworks (e.g., Vue.js or React).
· Experience in using Git, Composer, and other development tools.
· Excellent problem-solving, debugging, and analytical skills.
· Strong leadership and communication abilities.

Preferred Skills
· Experience in mobile application development or working alongside mobile teams is a strong advantage.
· Experience with Agile methodologies and tools like JIRA.
· Familiarity with CI/CD pipelines and deployment strategies.
· Knowledge of cloud services such as AWS or DigitalOcean.
· Experience in handling client communication and technical presentations.

Posted 1 week ago

Apply

15.0 years

0 Lacs

India

Remote

Job Title: Data Engineer Lead - AEP
Location: Remote
Experience Required: 12–15 years overall experience; 8+ years in Data Engineering; 5+ years leading Data Engineering teams; cloud migration and consulting experience (GCP preferred)

Job Summary:
We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architecting scalable data solutions and ensuring successful delivery in high-impact environments.

Key Responsibilities:
Lead end-to-end data engineering projects including cloud migration of legacy ETL pipelines and data warehouses to GCP (BigQuery).
Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services.
Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks.
Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders.
Define and enforce data engineering best practices, coding standards, and CI/CD processes.
Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture.
Monitor project progress, ensure delivery timelines, and manage client expectations.
Engage in technical pre-sales and solutioning, driving excellence in consulting delivery.

Technical Skills & Tools:
Cloud Platforms: strong experience with Google Cloud Platform (GCP), particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, Pub/Sub.
ETL/ELT Tools: Apache Airflow, Dataform, dbt (if applicable).
Languages: Python, SQL, shell scripting.
Data Warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle).
DevOps: Git, CI/CD pipelines, Docker.
Data Modeling: dimensional modeling, Data Vault, star/snowflake schemas.
Data Governance & Lineage: Dataplex, Collibra, or equivalent tools.
Monitoring & Logging: Stackdriver, DataDog, or similar.

Preferred Qualifications:
Proven consulting experience with premium clients or Tier 1 consulting firms.
Hands-on experience leading large-scale cloud migration projects.
GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect).
Strong client communication, stakeholder management, and leadership skills.
Experience with agile methodologies and project management tools like JIRA.
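As a small, hypothetical illustration of the BigQuery-centric engineering this role describes, the sketch below uses the google-cloud-bigquery Python client to materialize a query result into a destination table; the project, dataset, and table names are placeholders.

```python
# Minimal sketch: run a BigQuery query and write the result to a destination table.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

destination = "example-project.analytics.daily_order_totals"
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.raw.orders`
    GROUP BY order_date
"""

query_job = client.query(sql, job_config=job_config)  # starts the query job
query_job.result()                                     # waits for completion
print(f"Wrote results to {destination}")
```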

Posted 1 week ago

Apply

4.0 years

0 Lacs

Roorkee, Uttarakhand, India

Remote

Company Description Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in over 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%. Job Description Join the Bot Development team to work on the implementation of a new bot solution across various cloud platforms. Your main responsibility will be to develop a bot using Google Cloud Platform technologies, focusing on integrating the LLM (playbook) feature in Dialogflow CX, in addition to standard Dialogflow CX functionalities. Responsibilities: Design, develop, and deploy chatbots and voice bots utilizing leading Conversational AI platforms such as Microsoft Bot Framework, Google CCAI, Dialogflow CX Craft clean, efficient, and maintainable code adhering to industry best practices and standards. Develop custom components and tools to optimize the functionality and performance of our chatbot ecosystem. Collaborate closely with developers, designers, and other stakeholders to meet project requirements and user expectations. Leverage natural language processing (NLP), LLM and machine learning (ML) techniques, including TTS, STT, and SSML, to enable our chatbots to comprehend and respond intelligently to user inputs. Integrate chatbot systems seamlessly with backend systems, databases, and APIs to facilitate smooth data exchange and interactions. Investigate and resolve complex technical issues by analysing logs and debugging code for continuous improvement. Stay ahead of the curve by keeping up-to-date with the latest trends and advancements in chatbot development. Qualifications 4+ years of hands-on experience with the Google Cloud Contact Center AI Bot framework. Proficient in Natural Language Processing (NLP), Natural Language Understanding (NLU), Machine Learning (ML), and Conversational AI. Extensive experience with the new LLM (playbook) feature in Dialogflow CX, as well as traditional Dialogflow CX functionalities. Solid understanding of Google as a Software-as-a-Service platform. Experience with Azure Composer Bots. Proven troubleshooting and analytical skills in complex, distributed environments. Familiarity with SQL and experience working with relational databases. Experience integrating web services into applications. Proficient in Agile and Scrum development methodologies. Bachelor’s degree in a technology-related field or equivalent experience. Nice to have: Experience with programming languages such as JavaScript, Python, or Node.js. Familiarity with automated testing practices. We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, and a relocation program. 
Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities. Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law.
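For context on the Dialogflow CX work described in this listing, here is a minimal, hypothetical webhook fulfillment sketch in Python using functions-framework; the webhook tag, parameter names, and reply text are invented for illustration, and the response shape follows the Dialogflow CX WebhookResponse JSON format.

```python
# Hypothetical sketch: a Dialogflow CX webhook fulfillment handler, deployable as
# a Cloud Function. Tag name, parameter names, and reply text are placeholders.
import functions_framework
from flask import jsonify

@functions_framework.http
def cx_webhook(request):
    body = request.get_json(silent=True) or {}
    tag = body.get("fulfillmentInfo", {}).get("tag", "")
    params = body.get("sessionInfo", {}).get("parameters", {})

    if tag == "order-status":                       # hypothetical webhook tag
        order_id = params.get("order_id", "unknown")
        reply = f"Order {order_id} is being processed."
    else:
        reply = "Sorry, I can't help with that yet."

    # Dialogflow CX reads the fulfillment messages from this response body.
    return jsonify({
        "fulfillment_response": {
            "messages": [{"text": {"text": [reply]}}]
        }
    })
```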

Posted 1 week ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Lead a team of Software Engineers to design, develop, and operate high scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity. What Experience You Need Bachelor's degree or equivalent experience 10+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. What Could Set You Apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? 
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
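Since the listing above calls out Dataflow/Apache Beam experience, here is a minimal, hypothetical Beam pipeline sketch in Python; the file paths and field names are placeholders, and on GCP this would typically run with the DataflowRunner and gs:// paths.

```python
# Minimal Apache Beam sketch: count events per user from newline-delimited JSON.
# File paths and field names are placeholders; add Dataflow options to run on GCP.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run() -> None:
    options = PipelineOptions()  # add runner/project/region flags for Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("events.jsonl")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
            | "Write" >> beam.io.WriteToText("user_counts")
        )

if __name__ == "__main__":
    run()
```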

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Company Description Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in over 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%. Job Description Miratech as a trusted partner seeks a CCAI BOT Developer to join our team remotely. This project focuses on developing and implementing advanced conversational AI solutions using the Google CCAI Bot framework. Scrum teams, including IVR and chatbot developers, collaborate to build intelligent voice bots and chatbots that enhance customer interactions in contact centers. The project integrates NLP, NLU, and machine learning technologies with backend systems, databases, and APIs to create scalable, high-performance solutions. It utilizes CI/CD pipelines, agile methodologies, and enterprise-scale technologies like Google Dialogflow, Genesys, and Nuance Mix Tools. Developers also work with REST-based microservices and automated testing to ensure reliability and continuous improvement of the chatbot ecosystem. Responsibilities: Design, develop, and deploy chatbots and voicebots using leading Conversational AI platforms such as Microsoft Bot Framework and Google Dialogflow. Write clean, efficient, and maintainable code following industry best practices and standards. Develop custom components and tools to enhance chatbot functionality, performance, and user experience. Collaborate with cross-functional teams, including developers, designers, and stakeholders, to align chatbot solutions with project goals and user needs. Utilize NLP and ML techniques, including TTS, STT, and SSML, to enable intelligent and context-aware chatbot interactions. Integrate chatbot systems with backend infrastructure, databases, and APIs to ensure seamless data flow and interaction. Troubleshoot and resolve technical issues by analyzing logs, debugging code, and implementing continuous improvements. Stay updated with emerging trends and advancements in chatbot development, AI, and Conversational UI technologies. Qualifications 3+ years of experience with the Google CCAI Bot framework, Dialogflow ES/CX, and Conversational AI technologies, including NLP, NLU, and ML. 3+ years of experience in IVR application development, including Nuance grammar development. Expertise in web services integration, including working with SQL databases, relational databases, and RESTful APIs. Experience with Google, Genesys, and related technologies, including GVP, Nuance Mix Tools. Strong understanding of agile development and Scrum best practices. Strong analytical skills for resolving technical issues in complex, distributed environments. Bachelor’s degree in a technology-related field or equivalent professional experience. Nice to have: Hands-on experience with Git, Jenkins, Maven, and automated testing methodologies. 
Experience with Genesys Composer. We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, language courses, and a relocation program. Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities. Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law. Additional Information All your information will be kept confidential according to EEO guidelines.
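As a small illustration of the Dialogflow CX integration work mentioned in this listing, here is a hypothetical sketch that sends a text query to a CX agent using the google-cloud-dialogflow-cx client; the project, location, agent ID, and query text are placeholders.

```python
# Hypothetical sketch: drive a Dialogflow CX agent from Python.
# Project, location, agent ID, and query text are placeholders.
import uuid

from google.cloud import dialogflowcx_v3 as cx

PROJECT = "example-project"
LOCATION = "global"
AGENT = "0000-aaaa-1111-bbbb"      # placeholder agent ID
LANGUAGE = "en-us"

client = cx.SessionsClient()
session_path = (
    f"projects/{PROJECT}/locations/{LOCATION}/agents/{AGENT}/sessions/{uuid.uuid4()}"
)

query_input = cx.QueryInput(
    text=cx.TextInput(text="Where is my order?"),
    language_code=LANGUAGE,
)
request = cx.DetectIntentRequest(session=session_path, query_input=query_input)
response = client.detect_intent(request=request)

# Print the agent's text replies for this turn.
for message in response.query_result.response_messages:
    if message.text:
        print(" ".join(message.text.text))
```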

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Lead Data Engineer

Job Summary
The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices, is responsible for repeatable, lean and maintainable enterprise BI design across organizations, and partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities
Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
May serve as project or DI lead, overseeing multiple consultants from various competencies.
Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration.
Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes.
Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else that is data related at the project or business unit levels.
Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best-practice standards. Toolsets include, but are not limited to: SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications
10 years of industry implementation experience with data integration tools such as AWS services: Redshift, Athena, Lambda, Glue, S3, ETL, etc.
5-8 years of management experience required.
5-8 years of consulting experience preferred.
Minimum of 5 years of data architecture, data modelling or similar experience.
Bachelor's degree or equivalent experience; Master's degree preferred.
Strong data warehousing, OLTP systems, data integration and SDLC.
Strong experience in orchestration and working experience with cloud-native / 3rd-party ETL data load orchestration.
Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.).
Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP).
Strong experience in Agile process (Scrum cadences, roles, deliverables) and working experience in either Azure DevOps, JIRA or similar, with experience in CI/CD using one or more code management platforms.
Strong Databricks experience required to create notebooks in PySpark.
Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.).
Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.).
Strong experience in orchestration and working experience in either Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar.
Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
3-5 years' development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience
Knowledge and working experience with Data Integration processes, such as Data Warehousing, EAI, etc.
Experience in providing estimates for Data Integration projects including testing, documentation, and implementation.
Ability to analyse business requirements as they relate to the data movement and transformation processes, with research, evaluation and recommendation of alternative solutions.
Ability to provide technical direction to other team members, including contractors and employees.
Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two together.
Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results.
Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM.
Can create documentation and presentations such that they "stand on their own".
Can advise sales on evaluation of Data Integration efforts for new or existing client work.
Can contribute to internal/external Data Integration proofs of concept.
Demonstrates the ability to create new and innovative solutions to problems that have previously not been encountered.
Ability to work independently on projects as well as collaborate effectively across teams.
Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
Strong team building, interpersonal, analytical, problem identification and resolution skills.
Experience working with multi-level business communities.
Can effectively utilise SQL and/or an available BI tool to validate/elaborate business rules.
Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues.
Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data.
Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks.
Deals effectively with all team members and builds strong working relationships/rapport with them.
Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.
Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic standpoint.
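To give a concrete sense of the Databricks/PySpark notebook work this listing mentions, here is a minimal, hypothetical PySpark sketch that aggregates raw orders into a daily totals table; the paths and table names are placeholders, and Delta Lake availability is assumed.

```python
# Minimal PySpark sketch: read raw orders, aggregate daily totals, write a table.
# Paths and table names are placeholders; on Databricks, `spark` already exists
# and the Delta format is available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_order_totals").getOrCreate()

orders = (
    spark.read.option("header", "true")
    .csv("/mnt/raw/orders/")                        # placeholder source path
    .withColumn("amount", F.col("amount").cast("double"))
)

daily_totals = (
    orders.groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

(
    daily_totals.write.format("delta")              # assumes Delta Lake
    .mode("overwrite")
    .saveAsTable("analytics.daily_order_totals")    # placeholder table name
)
```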

Posted 1 week ago

Apply

3.0 years

0 Lacs

Delhi, India

Remote

Company Description Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in over 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%. Job Description Miratech as a trusted partner seeks a CCAI BOT Developer to join our team remotely. This project focuses on developing and implementing advanced conversational AI solutions using the Google CCAI Bot framework. Scrum teams, including IVR and chatbot developers, collaborate to build intelligent voice bots and chatbots that enhance customer interactions in contact centers. The project integrates NLP, NLU, and machine learning technologies with backend systems, databases, and APIs to create scalable, high-performance solutions. It utilizes CI/CD pipelines, agile methodologies, and enterprise-scale technologies like Google Dialogflow, Genesys, and Nuance Mix Tools. Developers also work with REST-based microservices and automated testing to ensure reliability and continuous improvement of the chatbot ecosystem. Responsibilities: Design, develop, and deploy chatbots and voicebots using leading Conversational AI platforms such as Microsoft Bot Framework and Google Dialogflow. Write clean, efficient, and maintainable code following industry best practices and standards. Develop custom components and tools to enhance chatbot functionality, performance, and user experience. Collaborate with cross-functional teams, including developers, designers, and stakeholders, to align chatbot solutions with project goals and user needs. Utilize NLP and ML techniques, including TTS, STT, and SSML, to enable intelligent and context-aware chatbot interactions. Integrate chatbot systems with backend infrastructure, databases, and APIs to ensure seamless data flow and interaction. Troubleshoot and resolve technical issues by analyzing logs, debugging code, and implementing continuous improvements. Stay updated with emerging trends and advancements in chatbot development, AI, and Conversational UI technologies. Qualifications 3+ years of experience with the Google CCAI Bot framework, Dialogflow ES/CX, and Conversational AI technologies, including NLP, NLU, and ML. 3+ years of experience in IVR application development, including Nuance grammar development. Expertise in web services integration, including working with SQL databases, relational databases, and RESTful APIs. Experience with Google, Genesys, and related technologies, including GVP, Nuance Mix Tools. Strong understanding of agile development and Scrum best practices. Strong analytical skills for resolving technical issues in complex, distributed environments. Bachelor’s degree in a technology-related field or equivalent professional experience. Nice to have: Proficiency in Core Java, Java/J2EE, Servlets, JSP, and REST-based microservices. 
Hands-on experience with Git, Jenkins, Maven, and automated testing methodologies. Experience with the Spring framework and familiarity with Tomcat or similar web servers. Experience with Genesys Composer. We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, language courses, and a relocation program. Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities. Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law. Additional Information All your information will be kept confidential according to EEO guidelines.
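Because this listing references TTS and SSML, here is a minimal, hypothetical sketch using the Google Cloud Text-to-Speech client to synthesize an SSML prompt; the voice name, prompt content, and output file are placeholders.

```python
# Hypothetical sketch: synthesize an SSML prompt with Google Cloud Text-to-Speech.
# Voice name, SSML content, and output path are placeholders.
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()

ssml = (
    "<speak>Your order number is "
    '<say-as interpret-as="characters">A12B</say-as>.'
    '<break time="300ms"/>Is there anything else I can help with?</speak>'
)

response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(ssml=ssml),
    voice=texttospeech.VoiceSelectionParams(
        language_code="en-US", name="en-US-Wavenet-D"
    ),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.LINEAR16
    ),
)

# Write the synthesized audio to a local WAV file.
with open("prompt.wav", "wb") as out:
    out.write(response.audio_content)
```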

Posted 1 week ago

Apply

Exploring Composer Jobs in India

India has a growing market for composer jobs, with various opportunities available for talented individuals in the music industry. Whether it's creating music for films, television, video games, or other media, composers play a vital role in shaping the overall experience for audiences. If you're considering a career in composing, here's a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Mumbai
  2. Chennai
  3. Bangalore
  4. Hyderabad
  5. Delhi

These cities are known for their vibrant entertainment industries and often have a high demand for composers across various projects.

Average Salary Range

The average salary range for composer professionals in India can vary depending on experience and expertise. Entry-level composers can expect to earn between INR 3-5 lakhs per year, while experienced composers with a strong portfolio can earn upwards of INR 10 lakhs per year.

Career Path

In the field of composing, a typical career path may involve starting as a Junior Composer, then progressing to a Composer, Senior Composer, and eventually a Music Director or Lead Composer. As you gain more experience and recognition for your work, you may have the opportunity to work on larger projects and collaborate with well-known artists.

Related Skills

In addition to composing skills, it is beneficial for composers to have a good understanding of music theory, proficiency in music production software, excellent communication skills for collaborating with directors and producers, and the ability to work under tight deadlines.

Interview Questions

  • What inspired you to pursue a career in composing? (basic)
  • Can you walk us through your creative process when composing music for a project? (medium)
  • How do you handle feedback and revisions from clients or directors? (medium)
  • Can you discuss a challenging project you worked on and how you overcame obstacles during the composition process? (advanced)
  • How do you stay updated on current trends in the music industry and incorporate them into your work? (medium)
  • Have you ever had to compose music for a project with a tight deadline? How did you manage your time effectively? (medium)
  • Can you provide examples of different genres or styles of music you are comfortable composing? (medium)
  • How do you ensure that your music aligns with the overall vision of a project? (advanced)
  • Have you ever collaborated with other musicians or artists on a composition? How did you approach that collaboration? (medium)
  • What software or tools do you use for composing and producing music? (basic)
  • Can you discuss a piece of music you composed that you are particularly proud of? (medium)
  • How do you handle creative blocks or moments of inspiration? (medium)
  • What is your experience working with live musicians or orchestras for recording sessions? (advanced)
  • How do you approach negotiating fees or contracts for your composition work? (medium)
  • Can you discuss a project where you had to compose music for a specific cultural or historical context? (advanced)
  • How do you ensure that your music is original and does not infringe on copyright laws? (medium)
  • Have you ever had to rework a composition multiple times based on client feedback? How did you handle that situation? (medium)
  • What do you think sets your composing style apart from others in the industry? (medium)
  • How do you approach creating a memorable and impactful musical theme for a project? (medium)
  • Can you discuss a project where you had to compose music for a non-traditional or experimental medium? (advanced)
  • How do you balance artistic integrity with meeting the client's expectations and requirements? (medium)
  • What is your process for creating a soundtrack that enhances the emotional impact of a scene in a film or game? (medium)
  • Can you discuss a time when you had to compose music that evoked a specific mood or atmosphere? (medium)
  • How do you approach collaborating with sound designers or audio engineers to enhance the overall sound of a project? (medium)

Closing Remark

As you prepare for composer roles in India, remember to showcase your unique talents and passion for music in your portfolio and interviews. With dedication and creativity, you can pursue a rewarding career in composing and contribute to the vibrant entertainment industry in India. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies