15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. We are seeking a visionary and highly motivated Vice President of Engineering to lead our Big Data Application Services organization. In this pivotal role, you will guide the engineering, solutions, and central replatforming teams, driving the transformation of our data engineering capabilities and the evolution toward our next gen big data platform. The ideal candidate is a strategic and innovative leader with a strong engineering background, deep expertise in big data technologies-especially on Google Cloud Platform (GCP), and a passion for advancing data observability, replatform acceleration, and cutting-edge data engineering solutions. How will you make an impact in this role? Define and execute a strategic roadmap for the Big Data Application Services, aligning with overall business objective and leading the transition to next-generation big-data platform Lead, mentor, and grow high-performing engineering, solutions and replatforming teams. Foster a culture of innovation, collaboration, engineering excellence, and continuous learning Drive the architecture, design and implementation of scalable, reliable and optimized data engineering capabilities and practices on Google Cloud Platform Champion best practices in software engineering, data engineering and operations within the big data ecosystem. Implement comprehensive data observability strategies and tools to ensure data quality, reliability and performance across the platform. Oversee the development and deployment of advanced data engineering and transformation capabilities, enabling sophisticated data analysis, business intelligence and machine learning use cases. Partner with stakeholders to define and deliver enterprise-wide impactful data solutions. Work closely with product management, federated data teams, data science, analytics teams, and other business units to understand data needs and deliver solutions that drive business value. Work with peers, Principal Engineers and Principal Architects to assimilate emerging trends and technologies in the big data, cloud, and analytics landscape. Evaluate and introduce new tools and technologies to enhance platform capabilities and drive innovation. Excellent leadership and interpersonal skills, with the ability to influence at all levels across functions, from both technical and non-technical perspectives alike; able to lead business and technology conversations with SVP and/or EVP level business leaders Qualifications 15+ years of large-scale technology delivery and formal management in a complex environment and/or comparable experience. With at least 7 years of experience in Big Data Technology delivery. Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies with a proven track record of turning new technologies into business solutions. Demonstrated ability to lead, inspire, and manage multi-disciplinary engineering teams in a fast-paced, global environment. 
Deep understanding and hands-on experience in designing and implementing secure, scalable, and cost-effective GCP Big Data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Composer/Airflow, Cloud Storage). Strong knowledge of data observability principles, frameworks, and tools. Experience in implementing solutions for data monitoring, logging, tracing, lineage, and quality alerting. Proven experience leading large-scale data platform modernization and replatforming initiatives. Experienced with replatforming tools and accelerators to streamline migration and reduce risk. Expertise in designing and implementing robust and scalable ETL/ELT pipelines and data transformations.
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: competitive base salaries; bonus incentives; support for financial well-being and retirement; comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location); a flexible working model with hybrid, onsite or virtual arrangements depending on role and business need; generous paid parental leave policies (depending on your location); free access to global on-site wellness centers staffed with nurses and doctors (depending on location); free and confidential counseling support through our Healthy Minds program; and career development and training opportunities.
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
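To illustrate the data-observability and quality-alerting work this role describes, the following is a minimal sketch using the google-cloud-bigquery Python client. The table name (`analytics.transactions`), the `ingest_ts` column, and the four-hour threshold are hypothetical placeholders, not details from the posting.

```python
# Minimal data-freshness check sketch for a BigQuery table (hypothetical names).
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery


def check_freshness(client: bigquery.Client, table: str, max_lag_hours: int = 4) -> bool:
    """Return True if the newest row in `table` was ingested recently enough."""
    query = f"SELECT MAX(ingest_ts) AS latest FROM `{table}`"
    latest = next(iter(client.query(query).result())).latest
    if latest is None:
        return False  # empty table counts as stale
    lag = datetime.now(timezone.utc) - latest
    return lag <= timedelta(hours=max_lag_hours)


if __name__ == "__main__":
    client = bigquery.Client()  # uses default project credentials
    ok = check_freshness(client, "analytics.transactions")
    print("freshness check passed" if ok else "ALERT: stale data")
```

In practice a check like this would be scheduled (for example from Composer/Airflow) and wired to an alerting channel rather than printing to stdout.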
Posted 1 week ago
0.0 - 2.0 years
0 Lacs
Hosur, Tamil Nadu
On-site
Responsibilities: Design and create visually stunning graphics for both print and digital platforms, adhering to brand guidelines and project requirements. Collaborate with the marketing and creative teams to develop innovative and effective design concepts. Produce high-quality designs that effectively communicate the desired message and objectives. Generate ideas and concepts to enhance visual communication strategies. Create and edit images, illustrations, and layouts using industry-standard software tools. Stay updated with the latest design trends, techniques, and technologies to bring fresh ideas to the team. Manage multiple projects simultaneously and meet deadlines while maintaining a high level of attention to detail. Proven experience as a Video Editor, with a strong portfolio demonstrating your editing skills and creativity. Proficiency in video editing software such as Adobe Premiere Pro, Final Cut Pro, or Avid Media Composer. Solid understanding of video editing principles, including pacing, storytelling, and visual aesthetics. Experience with motion graphics and visual effects software (e.g., Adobe After Effects) is a plus. Strong attention to detail, with the ability to maintain consistency in video quality and style. Excellent knowledge of video formats, codecs, and compression techniques. Ability to work independently and collaboratively in a fast-paced environment. Effective communication skills to collaborate with team members and clients.
Job Type: Full-time
Pay: ₹15,000.00 - ₹35,000.00 per month
Benefits: Commuter assistance, flexible schedule, food provided, leave encashment, paid sick time, paid time off
Schedule: Day shift, fixed shift, weekend availability
Supplemental Pay: Overtime pay, yearly bonus
Ability to commute/relocate: Hosur, Tamil Nadu: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Video editing: 1 year (Required); Creative Designing: 2 years (Required); Photography: 1 year (Required); Adobe Photoshop: 2 years (Required); Adobe Premiere: 2 years (Required)
Work Location: In person
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Kyndryl Data Science Bengaluru, Karnataka, India Chennai, Tamil Nadu, India Posted on Jun 9, 2025 Apply now Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. As GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs. Responsibilities Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements. Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows. Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services. Develop and maintain Python / PySpark for data processing and integrate with GCP services for seamless data operations. Develop and optimize SQL queries for data analysis and reporting. Monitor and troubleshoot data pipeline issues to ensure timely resolution. Implement data governance and security best practices within GCP. Perform data quality checks and validation to ensure accuracy and consistency. Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines. Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management. Provide technical support and guidance to junior data engineers and other team members. Participate in code reviews and contribute to continuous improvement of data engineering practices. Implement best practices for cost management and resource utilization within GCP. If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. 
You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Technical And Professional Experience
Bachelor's or master's degree in Computer Science, Engineering, or a related field with over 8 years of experience in data engineering. More than 3 years of experience with the GCP data ecosystem. Hands-on experience and strong proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion. Excellent command of SQL with the ability to write complex queries and perform advanced data transformation. Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines. Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc. Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP. Knowledge of data governance, security, and compliance best practices. Experience with private and public cloud architectures, pros/cons, and migration considerations. Excellent problem-solving, analytical, and critical thinking skills. Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail. Communication skills: able to communicate with both technical and non-technical audiences and to derive technical requirements with stakeholders. Ability to work independently and in agile teams.
Preferred Technical And Professional Experience
GCP Data Engineer Certification is highly preferred. Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization. Experience working as a Data Engineer and/or in cloud modernization. Knowledge of Databricks or Snowflake for data analytics. Experience with NoSQL databases. Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes). Familiarity with BI dashboards and Google Data Studio is a plus.
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone who works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
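As an illustration of the BigQuery/Dataflow pipeline work this role describes, here is a minimal Apache Beam sketch in Python. The bucket path, dataset/table name, and field names are hypothetical placeholders rather than anything specified in the posting.

```python
# Minimal GCS-to-BigQuery batch pipeline sketch using Apache Beam (hypothetical names).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn a CSV line of the form `customer_id,amount` into a BigQuery-ready dict."""
    customer_id, amount = line.split(",")
    return {"customer_id": customer_id, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions()  # pass --runner=DataflowRunner, --project, etc. on the CLI
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "example_dataset.transactions",
                schema="customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline code runs locally with the DirectRunner for testing and on Dataflow in production; only the pipeline options change.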
Posted 1 week ago
4.0 - 5.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
About Sun Pharma: Sun Pharmaceutical Industries Ltd. (Sun Pharma) is the fourth largest specialty generic pharmaceutical company in the world with global revenues of US$ 5.4 billion. Supported by 43 manufacturing facilities, we provide high-quality, affordable medicines, trusted by healthcare professionals and patients, to more than 100 countries across the globe.
Job Summary
The EDMS Development and Configuration specialist will be responsible for the successful development, deployment, configuration, and ongoing support of EDMS 21.2. This role requires a deep understanding of EDMS LSQM workflows, strong technical skills, and the ability to work closely with cross-functional teams to ensure the EDMS meets the needs of the organization.
Roles and Responsibilities
• Assist in the development and maintenance of the Documentum D2 LSQM application, including custom workflows and document management solutions.
• Collaborate with senior developers to understand requirements and translate them into technical specifications.
• Support the testing and debugging of Documentum applications to ensure high-quality output and performance.
• Document development processes and maintain accurate technical documentation.
• Solid understanding of content management principles and best practices, with experience in implementing Documentum solutions in enterprise environments.
• Familiarity with Java, SQL, and web services integration for developing Documentum applications.
• Expertise in the Documentum platform and its components, including Documentum Content Server and Documentum Webtop.
• Proficiency in using development tools such as Documentum Composer and Documentum Administrator.
• Experience with version control systems (e.g., Git) and agile development methodologies.
Qualifications and Preferences
Qualifications:
• Bachelor's degree in Information Technology, or a related field.
• Minimum of 4-5 years of experience in EDMS LSQM configuration, preferably in a pharmaceutical or biotech environment.
• Strong understanding of Category 1, Category 2 & 3 workflows.
• Proficiency in Documentum LSQM software.
• Ability to manage multiple tasks and projects simultaneously.
• Strong analytical and problem-solving skills.
• Excellent communication and interpersonal skills.
Preferred Qualifications:
• Advanced degree in Information Technology or a related field.
• Experience with database management and DQL.
• Understanding of Documentum Content Server and its APIs.
• Familiarity with Documentum DQL (Documentum Query Language).
• Experience in Documentum development, including proficiency in Documentum Foundation Classes (DFC) and Documentum Query Language (DQL).
• Basic knowledge of RESTful services and web development principles.
Selection Process: Interested candidates are mandatorily required to apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an Online Assessment and/or a Technical Screening interview administered by Jigya, on behalf of Sun Pharma. Candidates selected after the screening rounds will be processed further by Sun Pharma.
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job description: Overall, more than 5 years of experience in data projects, with good knowledge of GCP, BigQuery, SQL, Python and Dataflow. Has worked on implementation projects building data pipelines, transformation logic and data models.
Job Title: GCP Data Engineer (Data Management – Engineering)
Education: Bachelor of Engineering in any discipline, or equivalent
Desired Candidate Profile
Technology/Engineering Expertise: 4+ years of experience implementing data solutions using GCP, BigQuery and SQL programming; proficient with the data access layer (RDBMS, NoSQL); experience implementing and deploying big data applications with GCP Big Data Services; SQL skills are good to have.
Soft skills: able to deal with a diverse set of stakeholders; proficient in articulation, communication and presentation; high integrity; problem-solving skills and a learning attitude; team player.
Key Responsibilities
Implement data solutions using GCP; must be familiar with programming in SQL and Python. Ensure clarity on NFRs and implement these requirements. Work with the Client Technical Manager to understand the customer's landscape and IT priorities. Lead performance engineering and capacity planning exercises for databases.
Additional expertise: 4+ years of experience implementing data pipelines for data analytics solutions; experience building solutions using Google Cloud Dataflow, Apache Beam and Java programming; experience with different development methodologies (RUP, Scrum, XP).
Skills: Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI SQL, GCP Dataflow, GCP Cloud Pub/Sub, Big Data Hadoop Ecosystem
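Since the mandatory skills above include GCP Cloud Composer and Apache Airflow, the following is a minimal, illustrative DAG sketch. The dag_id, dataset/table names and SQL are hypothetical placeholders, not details from the posting.

```python
# Minimal Cloud Composer / Airflow DAG sketch scheduling a daily BigQuery rollup (hypothetical names).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE reporting.daily_sales AS
                    SELECT sale_date, SUM(amount) AS total
                    FROM staging.sales
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )
```

Deploying a DAG like this on Cloud Composer is simply a matter of copying the file into the environment's DAGs bucket; additional tasks (quality checks, exports) can then be chained with the `>>` operator.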
Posted 1 week ago
1.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
· Hands-on experience with data tools and technologies is a must.
· Partial design experience is acceptable, but the core focus should be on strong data skills.
· Will be supporting the pre-sales team from a hands-on technical perspective.
· GCP experience: Looker / BigQuery / Vertex – any of these, with 6 months to 1 year of experience.
Requirements
On day one we'll expect you to... Own the modules and take complete ownership of the project. Understand the scope, design and business objective of the project and articulate it in the form of a design document. Strong experience with Google Cloud Platform data services, including BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, GenAI (Gemini, Imagen, Veo). Experience in implementing data governance on GCP. Familiarity with integrating GCP services with other platforms like Snowflake, and hands-on Snowflake project experience is a plus. Experienced coder in Python, SQL, ETL and orchestration tools. Experience with containerized solutions using Google Kubernetes Engine. Good communication skills to interact with internal teams and customers. Expertise in PySpark (both batch and real-time), Kafka, SQL, and data querying tools. Experience working with a team, continuously monitoring, working hands-on as an individual contributor and helping the team deliver their work as you deliver yours. Experience working with large volumes of data in a distributed environment, keeping parallelism and concurrency in mind and ensuring performant and resilient systems. Optimize the deployment architecture to reduce job run-times and resource utilization. Develop and optimize data warehouses given the schema design.
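Given the PySpark (batch and real-time) expertise called for above, here is a minimal batch-transformation sketch; the bucket paths, column names and filter values are hypothetical placeholders used only for illustration.

```python
# Minimal PySpark batch aggregation sketch: read Parquet, aggregate, write back (hypothetical names).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-rollup").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/orders/")

daily = (
    orders
    .filter(F.col("status") == "COMPLETED")          # keep only completed orders
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

daily.write.mode("overwrite").parquet("gs://example-bucket/reports/daily_orders/")
spark.stop()
```

On GCP this kind of job would typically be submitted to Dataproc (or Dataproc Serverless), with the same code running unchanged locally for development.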
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru
On-site
Bengaluru, Karnataka Job ID JR2025454584 Category Information Technology Post Date Jun. 08, 2025 Job Description At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us. As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing’s team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company’s core values of safety, quality and integrity. Technology for today and tomorrow The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace. People-driven culture At Boeing, we believe creativity and innovation thrives when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts – enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people’s careers and being thoughtful about employee wellbeing. With us, you can create and contribute to what matters most in your career, community, country, and world. Join us in powering the progress of global aerospace. About Position / Position Summary The Boeing India IT&DA Engineering Division team is currently looking for a PLM Developer - 3DX to join their team in Bangalore, India. Responsibilities include the development and integration of a variety of PLM tools and in-house software applications supporting our engineering teams. The position will require strong skills in the PLM domain and PLM tools, specifically products like ENOVIA, DELMIA, CATIA, and PROCESS COMPOSER, as well as development and customization in the 3DExperience platform. The successful candidate will be responsible for the software development and customization of the PLM tools, primarily on the 3DExperience platform . This role will be based out of Bengaluru, India. Position Responsibilities: The position will require development and customization of PLM tools on the 3DExperience platform, ensuring optimal system performance through installation and administration. Responsibilities include programming in C++, C#, Python, and ASP.NET, as well as creating dashboard widgets and scripts. The role involves analyzing specifications, documenting designs, and testing/debugging software. Collaboration with cross-functional teams is essential, along with effective problem resolution and innovation in development processes. Familiarity with Agile practices and tools like GIT and Azure DevOps is preferred. 
A self-starter attitude and strong communication skills are crucial for success in this position. Employer will not sponsor applicants for employment visa status. Basic Qualifications (Required Skills/Experience): Experience with 3DX installation/configuration and system administration functionalities . Experience with 3DX DevAppSuite (CAA), EKL and VB/.Net automation . Hands-on experience in C++, C#, Python, ASP.NET, and shell scripting . Develop Java Program Objects (JPOs) and Tcl/Tk scripts to implement triggers and clean up existing migrated database information . Experience with dashboard widget creation/customization . Good functional knowledge of Catia V5 and 3DExperience . Develop design documents based on functional specifications and requirements in a concise manner . Analyze, design, code, test, and debug existing and new programs to support the customization of the enterprise-level 3DX PLM . Must have excellent debugging and problem-solving skills . Develop software integrations between the 3DExperience platform and existing systems using web services and other related development tools . Must have experience working with cross-functional teams spread across multiple products and locations . Strong written and verbal communication skills are required . Candidate must be a self-starter with a positive attitude, high ethics, and a track record of working successfully under pressure in a time-constrained environment . Effectively resolve problems and roadblocks as they occur, consistently following through on details while driving innovation and issue resolution . Preferred Qualifications (Desired Skills/Experience) : Functional experience in product development within the aerospace/automotive domain . Experience in web applications using Tomcat, HTML, JavaScript, and J2EE technologies . Knowledge of data structures and design patterns . Familiarity with tools such as Coverity, GIT and Azure DevOps . Knowledge of Agile development practices . Good to have skills in public cloud Azure . Typical Education & Experience: Education/experience typically acquired through advanced education (e.g. Bachelor) and typically 4 Plus years' related work experience or Master’s Degree with 5+ years of experience with an equivalent combination of education and experience Relocation: This position does offer relocation within INDIA. Applications for this position will be accepted until Jun. 16, 2025 Export Control Requirements: This is not an Export Control position. Relocation This position offers relocation based on candidate eligibility. Visa Sponsorship Employer will not sponsor applicants for employment visa status. Shift Not a Shift Worker (India) Equal Opportunity Employer: We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds including but not limited to; race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law. We have teams in more than 65 countries, and each person plays a role in helping us become one of the world’s most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. 
Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers and offering flexible interview formats such as virtual or phone interviews.
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad
On-site
Position: AOSP Streaming Developer (CE710SI RM 3292)
Shift timing: Regular
Work Mode: Client Office (5 Days)
Education: BE/B.Tech
Relevant experience: 5+ Years
Must have skills: Android, Streaming, C
Job Description: We are seeking a proactive and enthusiastic individual to join our team as a Streaming/AirMedia Software Development Engineer. This role offers a unique opportunity to work on cutting-edge technologies and contribute to the development of video conferencing and streaming solutions.
Responsibilities / Job Description
1. Daily activities will include the full software development life-cycle: design, develop, modify, test, debug, and support.
2. Work closely with other engineers.
Technical skills required: Proficiency in programming languages (C/C++ or Java). Communication networks and protocols (Ethernet, TCP/UDP/IP, etc.). Experience with multimedia frameworks (GStreamer) and streaming protocols (e.g., RTP, RTSP, HLS, MPEG-DASH). Experience with Android's graphics stack, including SurfaceFlinger, Hardware Composer and BufferQueue. Experience with video rendering frameworks like MediaCodec, OpenGL ES or Vulkan. Experience with Android's HAL, HIDL/AIDL layer and Treble-compliant system designs. Strong debugging and problem-solving skills. Excellent communication and interpersonal skills. Meticulous attention to detail and strong organizational sense. Motivated, with the ability to work independently as well as part of a team.
Job Category: Embedded HW_SW
Job Type: Full Time
Job Location: Ahmedabad, Bangalore, Hyderabad, Pune
Experience: 7-10 Years
Notice period: 0-15 days / Immediate
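As a rough illustration of the GStreamer/RTSP streaming work listed above, here is a minimal Python (PyGObject) sketch that receives and decodes an RTSP stream. The stream URL is a hypothetical placeholder, and the posting itself targets C/C++ or Java, so treat this purely as a conceptual example of the pipeline involved.

```python
# Minimal GStreamer sketch: pull an RTSP stream, decode it, and render it (hypothetical URL).
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    "rtspsrc location=rtsp://example.local/stream latency=200 "
    "! decodebin ! videoconvert ! autovideosink"
)
pipeline.set_state(Gst.State.PLAYING)

loop = GLib.MainLoop()
try:
    loop.run()  # keep streaming until interrupted
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)
```

The equivalent C/Android implementation would build the same element graph programmatically and hand decoded buffers to MediaCodec/SurfaceFlinger rather than `autovideosink`.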
Posted 1 week ago
3.0 years
4 - 10 Lacs
India
On-site
Job Description: Embedded Software Engineer Our company is looking for a skilled embedded software engineer to join our team. As a key team member, you will be crucial in designing, developing, and testing embedded software, including coding, debugging, testing, troubleshooting, and documenting. Candidates with solid software design skills and a commitment to innovation would be preferred for the role. If you’re a talented and innovative engineer with a passion for developing software solutions and have a proven track record in embedded systems, firmware development, and a strong understanding of hardware-software integration, we invite you to apply. We offer competitive compensation, a modern work environment, and opportunities for professional growth. Objectives of this role Developing and implementing embedded software solutions for various systems for Power Electronics as per client needs. Collaborating with cross-functional teams, including hardware engineers, to define software requirements and specifications. Conducting feasibility studies and system analysis to ensure software compatibility with hardware components. Writing and optimizing efficient, reusable and scalable embedded code. Performing unit testing and debugging to ensure software functionality and reliability. Contributing to the documentation and maintenance of software applications. Your tasks Design and develop embedded software solutions for microcontrollers and microprocessors such as STM32, PIC, Texas, Arduino. Collaborate with hardware engineers to define software requirements and specifications. Write efficient, modular and well-documented code in C/C++ and/or Python. Work with testing teams to ensure software meets quality standards. Implement software updates and patches based on feedback and testing results. Debugging and troubleshooting software to identify and resolve issues. Conducting code reviews and ensuring compliance with coding standards. Stay updated on industry advancements in embedded software development. Required skills and qualifications Bachelor’s degree in Electrical Engineering, Electronics Engineering, Computer Engineering or a related field. 3+ years of experience as an embedded software engineer. Proficiency in programming languages such as C and C++ for embedded systems and basic knowledge of Python. Experience with microcontrollers, microprocessors such as STM32, PIC, Texas. Experience with using software development tools such as STM32CubeIDE, Code Composer Studio or any related tool. Detail-oriented with excellent problem-solving and analytical skills for Power Electronic Systems. Must have good knowledge of Power Electronics systems such as DC-DC Converters, Power Supplies. Knowledge of communication protocols (e.g., SPI, I2C, UART, CAN) and device drivers. Good to have experience in IoT protocol such as MQTT and cloud platform (AWS, Azure or GCP). Good to have experience using Version control such as GitHub, Bitbucket. Preferred skills and qualifications Master or advanced degree in Electrical Engineering, Electronics Engineering, Computer Engineering, or a related field. Familiarity with software version control systems (e.g., Git). Certifications in embedded systems, Power Electronics Systems, IoT applications, or related areas. Familiarity with software development tools and version control systems. Experience with agile software development and embedded system security. 
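To illustrate the "good to have" MQTT/IoT telemetry experience mentioned above, here is a minimal Python sketch using the paho-mqtt client. The broker address, topic and payload fields are hypothetical placeholders, and the constructor shown is the paho-mqtt 1.x style (2.x additionally takes a callback API version argument).

```python
# Minimal MQTT telemetry-publishing sketch for a power-electronics controller (hypothetical names).
import json
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()  # paho-mqtt 1.x constructor; 2.x also needs a CallbackAPIVersion
client.connect("broker.example.local", 1883, keepalive=60)
client.loop_start()  # background network loop

try:
    while True:
        payload = json.dumps({"bus_voltage": 47.8, "temperature_c": 41.2})
        client.publish("converter/telemetry", payload, qos=1)
        time.sleep(5)
except KeyboardInterrupt:
    client.loop_stop()
    client.disconnect()
```

On the embedded side the same topic structure would typically be published from C firmware via a lightweight MQTT client, with a script like this used for bench testing.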
Statcon Electronics is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, age, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Statcon Electronics is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit www.sindia.co.in. Job Type: Full-time Pay: ₹400,000.00 - ₹1,000,000.00 per year Schedule: Day shift Work Location: In person
Posted 1 week ago
55.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Company Description
TPF Engineering Pvt. Ltd. is a civil engineering consultancy firm based in Navi Mumbai, specializing in infrastructure design and consultancy for over 55 years. We provide sustainable infrastructure advisory services and strive to be recognized as the best engineering consultant in the market. As a subsidiary of TPF S.A., a Belgian multi-national organization with over 4000 collaborators worldwide, we are committed to excellence in engineering.
Role Description
This is a full-time on-site role for a Highway Design Engineer (Core Highway Design) located in Vashi, Navi Mumbai. The Highway Design Engineer will be responsible for tasks such as highway design, roadway design, drainage design, and transportation planning.
Qualifications
Master's degree in Transportation Engineering or a related field. Minimum 3 to 5 years of experience in core highway design. Minimum work experience of a completed (either partially – 50% or fully – 100%) Detailed Design project / DPR works with a focus on design aspects. Experience working on proof checking and Authority/Independent Engineering projects. Lead and manage the geometric design of highways, expressways, and rural roads per IRC and MoRTH standards. Proficiency in AutoCAD Civil 3D (plan and profile drawings, corridor modelling, multiple cross-sections, earthwork, sub-assembly composer, etc.), KGPBack, IITPave, etc. Strong knowledge of highway design standards (IRC or other relevant codes). Experience with pavement design, drainage design and analysis, and traffic engineering. Familiarity with construction materials, methods, and best practices.
• Coordinate with multidisciplinary teams including structural and geotechnical engineers, surveyors, and environmental engineers.
• Mentor junior engineers and review their design deliverables.
• Interact with clients, government agencies (such as MoRTH, NHAI, PWD), and contractors.
• Prepare documentation for DPRs, feasibility studies, and technical presentations.
Posted 1 week ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: React.js, Cloud Network Operations
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement scalable applications using Google BigQuery.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct code reviews and provide technical guidance to junior developers.
- Stay updated on industry trends and best practices in application development.
- Troubleshoot and resolve application issues in a timely manner.
Professional & Technical Skills (project-specific):
- BQ, BQ Geospatial, Python, Dataflow, Composer; secondary skill: geospatial domain knowledge.
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
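Because the project-specific skills above call out BigQuery and BQ Geospatial alongside Python, here is a minimal illustrative sketch using the google-cloud-bigquery client. The `geo.sites` table, its columns, and the coordinates are hypothetical placeholders, not anything specified in the posting.

```python
# Minimal BigQuery geospatial query sketch: find sites within 5 km of a point (hypothetical table).
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT site_id, site_name
    FROM `geo.sites`
    WHERE ST_DWITHIN(
        ST_GEOGPOINT(longitude, latitude),
        ST_GEOGPOINT(@lon, @lat),
        5000  -- distance in metres
    )
"""
job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("lon", "FLOAT64", 76.96),
            bigquery.ScalarQueryParameter("lat", "FLOAT64", 11.02),
        ]
    ),
)
for row in job.result():
    print(row.site_id, row.site_name)
```

Using named query parameters as shown keeps the SQL injection-safe and lets the same query be reused from Dataflow or Composer tasks.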
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Demonstrate a deep understanding of cloud native, distributed micro service based architectures Deliver solutions for complex business problems through software standard SDLC Build strong relationships with both internal and external stakeholders including product, business and sales partners Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Build and manage strong technical teams that deliver complex software solutions that scale Manage teams with cross functional skills that include software, quality, reliability engineers, project managers and scrum masters Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure Leverage strong experience in full stack software development and public cloud like GCP and AWS Mentor, coach and develop junior and senior software, quality and reliability engineers Lead with a data/metrics driven mindset with a maniacal focus towards optimizing and creating efficient solutions Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Drive up-to-date technical documentation including support, end user documentation and run books Lead Sprint planning, Sprint Retrospectives, and other team activity Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 7+ years experience with Cloud technology: GCP, AWS, or Azure 7+ years experience designing and developing cloud-native solutions 7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. 
Strong communication and presentation skills Strong leadership qualities Demonstrated problem solving skills and the ability to resolve conflicts Experience creating and maintaining product and software roadmaps Experience overseeing yearly as well as product/project budgets Working in a highly regulated environment Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Show more Show less
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Mulshi, Maharashtra, India
On-site
Area(s) of responsibility
Cloud Technical – Data Migration
1. A minimum of 10-14 years' experience in an Oracle Cloud technical development role, with prior techno-functional experience in Oracle EBS.
2. Sound knowledge of Oracle SaaS cloud data migrations and inbound integrations using File Based Data Import (FBDI), FBDI automation using Oracle OIC, inbound SOAP web services, inbound REST APIs, ADFdi Spreadsheet Data Loader, and Import File using UCM Web Service.
3. Hands-on experience with Oracle Cloud reporting tools such as BI Publisher (BIP), BIP Bursting, Secure BIP Report, Oracle Transactional Business Intelligence (OTBI), OTBI Analysis, Dashboards, Drill Down Report in OTBI, Use OTBI Analysis in BIP, and Secure OTBI Report.
4. Working knowledge of ESS job submission and scheduling: create custom ESS jobs, parametrize ESS jobs, LOVs with lookups and value sets, secure ESS jobs.
5. Exposure to extensions and customizations: sandboxes, creating infolets, customizing standard Fusion UI/pages, integrating external applications, Application Composer, and migrating customizations.
6. OIC integration to import bulk data using FBDI is a plus.
7. Design, develop and support integrations in OIC to Oracle ERP Cloud, including extracting Oracle ERP Cloud data using BI Publisher reports, analyses and OTBI reports.
8. Provide hands-on technical and development support for implemented Oracle ERP Cloud modules.
9. Fusion Cloud security experience such as Security Console, managing users and roles, role provisioning and data access.
10. Knowledge of Oracle interface tables in financial and procurement modules.
11. Hands-on experience with XSLT.
Posted 1 week ago
20.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Company Overview iFlair Web Technologies Pvt. Ltd. is a premier software development company with expertise in web, mobile, and e-commerce technologies. With over 20 years of industry experience, we have successfully delivered 3500+ projects to global clients. Our solutions are tailored to meet business needs and enhance customer experiences. (https://www.iflair.com) Job Summary We are looking for a highly skilled and experienced Technical Team Leader with deep expertise in Laravel and PHP frameworks. The ideal candidate will have a minimum of 5 years of experience in web development and will be responsible for leading a team of developers, ensuring code quality, and driving the technical success of projects. Experience or exposure to mobile application development will be an added advantage. Key Responsibilities · Lead and manage a team of Laravel/PHP developers to deliver high-quality projects. · Design and develop scalable web applications using the Laravel framework. · Architect robust, secure, and scalable PHP-based applications. · Conduct code reviews, mentor team members, and enforce best practices in development. · Collaborate with project managers, designers, and other teams to ensure smooth project delivery. · Manage project timelines, risks, and resource planning. · Troubleshoot and debug complex technical issues. · Stay up-to-date with Laravel and PHP trends, tools, and practices. · Ensure documentation and technical specifications are maintained. · Make informed decisions in the best interest of project execution and work independently when needed. Required Qualifications · Bachelor’s or Master’s degree in Computer Science, Engineering, or related field. · Minimum 5 years of experience in PHP and Laravel development. · Strong understanding of OOP principles, MVC architecture, and RESTful API development. · Proficient in MySQL, HTML, CSS, JavaScript, and modern front-end frameworks (e.g., Vue.js or React). · Experience in using Git, Composer, and other development tools. · Excellent problem-solving, debugging, and analytical skills. · Strong leadership and communication abilities. Preferred Skills · Experience in mobile application development or working alongside mobile teams is a strong advantage. · Experience with Agile methodologies and tools like JIRA. · Familiarity with CI/CD pipelines and deployment strategies. · Knowledge of cloud services such as AWS or DigitalOcean. · Experience in handling client communication and technical presentations. What We Offer · Opportunity to work on challenging and innovative projects. · Flexible working hours and a hybrid work culture. · Continuous learning and career development programs. · Competitive salary and performance-based incentives. · Health insurance and other employee benefits. · Positive and collaborative work environment. Show more Show less
Posted 1 week ago
0 years
0 Lacs
Pune
Remote
Role: Director and Screenwriter – Building Stories with Technology
Responsibilities: Build and manage teams. Attend casting sessions and select actors. Interpret scripts and understand the story and narrative style. Oversee rehearsals to ensure actors understand your artistic vision. Identify set locations for different scenes in the film. Work within budgetary constraints when needed. Adhere to a production schedule to ensure the film is completed on time. Coordinate with the camera crew, art directors, costume designers and musical composer to ensure a consistent creative execution.
Job Type: Internship
Contract length: 3 months
Pay: From ₹1,000.00 per month
Schedule: Day shift, Monday to Friday
Supplemental Pay: Performance bonus
Education: Bachelor's (Preferred)
Location: Pune, Maharashtra (Preferred)
Work Location: Hybrid remote in Pune, Maharashtra
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Responsibilities: GCP Data Engineer, 5+ yrs, {BigQuery, Composer, Cloud Functions, Python, SQL}, Certification optional. Mandatory skill sets: GCP Data Engineer, 5+ yrs, {BigQuery, Composer, Cloud Functions, Python, SQL}, Certification optional. Preferred skill sets: GCP Data Engineer, 5+ yrs, {BigQuery, Composer, Cloud Functions, Python, SQL}, Certification optional. Years of experience required: 5-8 years Education qualification: BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Good Clinical Practice (GCP), Structured Query Language (SQL) Optional Skills Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date Show more Show less
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You’ll Be Doing… We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, which include consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon. Understanding the business requirements and converting them to technical design. Working on Data Ingestion, Preparation and Transformation. Developing data streaming applications. Debugging production failures and identifying the solution. Working on ETL/ELT development. Understanding the DevOps process and contributing to DevOps pipelines. What We’re Looking For... You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You’ll need to have… Bachelor’s degree or four or more years of work experience. Four or more years of work experience. Experience with Data Warehouse concepts and the Data Management life cycle. Experience in any DBMS. Experience in Shell scripting, Spark, Scala. Experience in GCP/BigQuery, Cloud Composer, Airflow. Experience in real-time streaming. Experience in DevOps. Even better if you have one or more of the following… Three or more years of relevant experience. Any relevant ETL/ELT developer certification. Certification in GCP-Data Engineer. Accuracy and attention to detail. Good problem solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to and influencing stakeholders. Experience in driving a small team of 2 or more members for technical delivery. #AI&D Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
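Purely as an illustration of the batch orchestration work this listing describes (GCP, BigQuery, Cloud Composer/Airflow), here is a minimal Airflow DAG sketch. It assumes the apache-airflow-providers-google package; the project, dataset, bucket, and table names are hypothetical placeholders, not actual Verizon resources.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: land a daily CSV drop from GCS
# into a BigQuery staging table, then run a transformation query.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

PROJECT = "example-telecom-analytics"  # hypothetical project
DATASET = "network_performance"        # hypothetical dataset

with DAG(
    dag_id="daily_network_kpi_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSV files from a GCS landing bucket into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",               # hypothetical bucket
        source_objects=["network_kpi/{{ ds }}/*.csv"],
        destination_project_dataset_table=f"{PROJECT}.{DATASET}.kpi_staging",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging rows into a curated reporting table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": f"""
                    INSERT INTO `{PROJECT}.{DATASET}.kpi_daily` (cell_id, avg_latency_ms)
                    SELECT cell_id, AVG(latency_ms) AS avg_latency_ms
                    FROM `{PROJECT}.{DATASET}.kpi_staging`
                    GROUP BY cell_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

In a real deployment the DAG file would simply be uploaded to the Composer environment's DAG bucket; the scheduler picks it up and runs the two tasks in order each day.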
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You’ll Be Doing… We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, which include consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon. Understanding the business requirements and converting them to technical design. Working on Data Ingestion, Preparation and Transformation. Developing data streaming applications. Debugging production failures and identifying the solution. Working on ETL/ELT development. Understanding the DevOps process and contributing to DevOps pipelines. What We’re Looking For... You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You’ll need to have… Bachelor’s degree or four or more years of work experience. Four or more years of work experience. Experience with Data Warehouse concepts and the Data Management life cycle. Experience in any DBMS. Experience in Shell scripting, Spark, Scala. Experience in GCP/BigQuery, Cloud Composer, Airflow. Experience in real-time streaming. Experience in DevOps. Even better if you have one or more of the following… Three or more years of relevant experience. Any relevant ETL/ELT developer certification. Certification in GCP-Data Engineer. Accuracy and attention to detail. Good problem solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to and influencing stakeholders. Experience in driving a small team of 2 or more members for technical delivery. #AI&D Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
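To illustrate the real-time streaming and Spark experience the listing above calls for, here is a minimal PySpark Structured Streaming sketch. The Kafka broker, topic name, and sink are hypothetical placeholders, and running it requires the Spark Kafka connector package on the classpath; it is a generic example, not Verizon's actual pipeline.

```python
# Minimal PySpark Structured Streaming sketch: read events from a Kafka topic
# and print 5-minute event counts to the console. Broker and topic names are
# hypothetical; a real pipeline would write to a durable sink instead.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usage-events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.internal:9092")  # hypothetical broker
    .option("subscribe", "usage-events")                                # hypothetical topic
    .load()
)

# Kafka delivers key/value as bytes; cast the value to string and count
# events per 5-minute event-time window.
counts = (
    events.select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
    .groupBy(F.window("timestamp", "5 minutes"))
    .agg(F.count("*").alias("event_count"))
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")  # swap for a Parquet or warehouse sink in a real pipeline
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```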
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are seeking a highly skilled and motivated Lead DS/ML Engineer to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. The position calls for a Data Scientist / ML Engineer with a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning to develop and deploy sophisticated models. The role focuses on building scalable data pipelines, developing ML models, and deploying solutions in production to support a reporting, insights, and recommendations platform for measuring and optimizing online marketing campaigns. The ideal candidate should be comfortable working across data engineering, the ML model lifecycle, and cloud-native technologies. Job Description: Key Responsibilities: Data Engineering & Pipeline Development Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data. Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect. Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads. Implement data modeling and feature engineering for ML use cases. Machine Learning Model Development & Validation Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations. Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis. Optimize models for scalability, efficiency, and interpretability. MLOps & Model Deployment Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving. Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining. Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms. Cloud & Infrastructure Optimization Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow. Work on containerized deployment (Docker, Kubernetes) for scalable model inference. Implement cost-efficient, serverless data solutions where applicable. Business Impact & Cross-functional Collaboration Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translate complex model insights into actionable business recommendations. Present findings and performance metrics to both technical and non-technical stakeholders. Qualifications & Skills: Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. Certifications in Google Cloud (Professional Data Engineer, ML Engineer) are a plus. Must-Have Skills: Experience: 5-10 years with the mentioned skillset & relevant hands-on experience Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. 
Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing. Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms. MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools). Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing. Nice-to-Have Skills: Experience with Graph ML, reinforcement learning, or causal inference modeling. Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards. Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies. Experience with distributed computing frameworks (Spark, Dask, Ray). Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent
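Purely as an illustration of the model-development portion of this role, here is a minimal scikit-learn sketch that trains and evaluates a campaign-conversion regressor. The CSV path, column names, and features are hypothetical placeholders, not the platform's actual schema.

```python
# Minimal campaign-performance modeling sketch: engineer a few spend/engagement
# features and fit a gradient-boosted regressor to predict conversions.
# The file name and columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("campaign_daily.csv")  # hypothetical extract from the warehouse

# Simple feature engineering: spend efficiency and click-through rate.
df["cost_per_click"] = df["spend"] / df["clicks"].clip(lower=1)
df["ctr"] = df["clicks"] / df["impressions"].clip(lower=1)

features = ["spend", "impressions", "clicks", "cost_per_click", "ctr"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["conversions"], test_size=0.2, random_state=42
)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"MAE on held-out days: {mean_absolute_error(y_test, preds):.2f}")
```

In the production setting the listing describes, a model like this would then be registered and served through an MLOps stack such as Vertex AI or MLflow rather than run as a one-off script.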
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Rajarhat, West Bengal, India
On-site
We are seeking a skilled Drupal Developer with 3-5 years of experience to join our growing team in Kolkata. In this role, you will design, develop, and maintain Drupal-based websites and applications, collaborating with cross-functional teams to deliver high-quality digital solutions. Experience Level: 3-5 years Location: Kolkata, India Requirements 3-5 years of professional experience with Drupal development (Drupal 8/9/10) Strong PHP programming skills Proficiency in front-end technologies (HTML, CSS, JavaScript) Experience with Drupal theming (Twig) Familiarity with Drupal module development Experience with Composer, Drush, and Git Working knowledge of MySQL/MariaDB Understanding of responsive design principles Demonstrated proficiency in leveraging AI-assisted development tools to enhance coding efficiency and problem-solving capabilities Responsibilities Develop and maintain websites and web applications using Drupal CMS Design and implement custom Drupal modules and themes Configure and optimize Drupal sites for performance, security, and scalability Perform Drupal version upgrades and migrations Integrate third-party applications and APIs with Drupal Troubleshoot and resolve site issues and bugs Collaborate with designers, project managers, and other developers Document technical specifications and development processes Stay current with Drupal best practices and emerging technologies Preferred Skills Acquia or Drupal certification Experience with agile development methodologies Familiarity with DevOps practices Knowledge of RESTful APIs Experience with e-commerce platforms (Drupal Commerce) Proficiency with CSS preprocessors (SASS/LESS)
Posted 1 week ago
4.0 years
0 Lacs
Roorkee, Uttarakhand, India
Remote
Company Description Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in over 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%. Job Description Miratech as a trusted partner seeks a CCAI BOT Developer to join our team remotely. This project focuses on developing and implementing advanced conversational AI solutions using the Google CCAI Bot framework. Scrum teams, including IVR and chatbot developers, collaborate to build intelligent voice bots and chatbots that enhance customer interactions in contact centers. The project integrates NLP, NLU, and machine learning technologies with backend systems, databases, and APIs to create scalable, high-performance solutions. It utilizes CI/CD pipelines, agile methodologies, and enterprise-scale technologies like Google Dialogflow, Genesys, and Nuance Mix Tools. Developers also work with REST-based microservices and automated testing to ensure reliability and continuous improvement of the chatbot ecosystem. Responsibilities: Design, develop, and deploy chatbots and voicebots using leading Conversational AI platforms such as Microsoft Bot Framework and Google Dialogflow. Write clean, efficient, and maintainable code following industry best practices and standards. Develop custom components and tools to enhance chatbot functionality, performance, and user experience. Collaborate with cross-functional teams, including developers, designers, and stakeholders, to align chatbot solutions with project goals and user needs. Utilize NLP and ML techniques, including TTS, STT, and SSML, to enable intelligent and context-aware chatbot interactions. Integrate chatbot systems with backend infrastructure, databases, and APIs to ensure seamless data flow and interaction. Troubleshoot and resolve technical issues by analyzing logs, debugging code, and implementing continuous improvements. Stay updated with emerging trends and advancements in chatbot development, AI, and Conversational UI technologies. Qualifications 4+ years of experience with the Google CCAI Bot framework, Dialogflow ES/CX, and Conversational AI technologies, including NLP, NLU, and ML. 4+ years of experience in IVR application development, including Nuance grammar development, GRAT, GRE. Expertise in web services integration, including working with SQL databases, relational databases, and RESTful APIs. Experience with Google, Genesys, and related technologies, including GVP, Nuance Mix Tools, and Genesys Composer. Hands-on experience with Git, Jenkins, Maven, and automated testing methodologies. Strong understanding of agile development and Scrum best practices. Strong analytical skills for resolving technical issues in complex, distributed environments. Experience with the Spring framework and familiarity with Tomcat or similar web servers. 
Bachelor’s degree in a technology-related field or equivalent professional experience. We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, language courses, and a relocation program. Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities. Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law.
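As an illustration of the Dialogflow work this listing centers on, here is a minimal detect-intent sketch using the google-cloud-dialogflow client library (Dialogflow ES, v2 API). The project ID, session ID, and utterance are hypothetical placeholders, and this is a generic quickstart-style call, not Miratech's actual bot implementation.

```python
# Minimal Dialogflow ES detect-intent sketch: send one user utterance to an
# agent and print the matched intent and fulfillment text. The project and
# session IDs below are hypothetical placeholders.
from google.cloud import dialogflow


def detect_intent_text(project_id: str, session_id: str, text: str,
                       language_code: str = "en-US"):
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    # Wrap the raw text in the request objects the API expects.
    text_input = dialogflow.TextInput(text=text, language_code=language_code)
    query_input = dialogflow.QueryInput(text=text_input)

    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    print("Matched intent:", result.intent.display_name)
    print("Bot reply:", result.fulfillment_text)
    return result


if __name__ == "__main__":
    detect_intent_text("example-ccai-project", "demo-session-001",
                       "I want to check my order status")
```

A voicebot flow would add speech handling (STT before the call, TTS or SSML on the reply) around the same detect-intent core.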
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python Work extensively on Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing Cloud Functions for lightweight serverless compute BigQuery for data warehousing and analytics Cloud Composer for orchestration of data workflows (based on Apache Airflow) Google Cloud Storage (GCS) for managing data at scale IAM for access control and security Cloud Run for containerized applications Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery Implement and enforce data quality checks, validation rules, and monitoring Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL Document pipeline designs, data flow diagrams, and operational support procedures Required Skills: 4–6 years of hands-on experience in Python for backend or data engineering projects Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.) Solid understanding of data pipeline architecture, data integration, and transformation techniques Experience in working with version control systems like GitHub and knowledge of CI/CD practices Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.) Good to Have (Optional Skills): Experience working with Snowflake cloud data platform Hands-on knowledge of Databricks for big data processing and analytics Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools
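For illustration, here is a minimal sketch of the GCS-to-BigQuery ingestion-and-validation step this listing describes, using the google-cloud-bigquery client library. The project, bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal ingestion sketch: load a CSV file from Cloud Storage into BigQuery
# and run a simple row-count data-quality check. All resource names below are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-data-platform")  # hypothetical project

table_id = "example-data-platform.sales.orders_raw"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema for the sketch; real pipelines pin schemas
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/orders/2024-01-01.csv",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load to finish

# Basic data-quality check: fail loudly if nothing was loaded.
rows = client.query(f"SELECT COUNT(*) AS n FROM `{table_id}`").result()
count = next(iter(rows)).n
assert count > 0, "No rows loaded into orders_raw"
print(f"orders_raw now holds {count} rows")
```

A script like this would typically be wrapped in a Cloud Composer task or a Cloud Function trigger rather than run by hand, with the validation step feeding monitoring and alerting.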
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Years of Experience: Candidates with 4+ years of hands-on experience Position: Senior Associate Industry: Supply Chain/Forecasting/Financial Analytics Required Skills: Successful candidates will have demonstrated the following skills and characteristics: Must Have Strong supply chain domain knowledge (inventory planning, demand forecasting, logistics) Well-versed in and hands-on experience with optimization methods such as linear programming, mixed integer programming, and scheduling optimization; an understanding of third-party optimization solvers such as Gurobi is an added advantage Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised) Experience using at least one major cloud platform (AWS, Azure, GCP), such as: AWS: Experience with AWS SageMaker, Redshift, Glue, Lambda, QuickSight Azure: Experience with Azure ML Studio, Synapse Analytics, Data Factory, Power BI GCP: Experience with BigQuery, Vertex AI, Dataflow, Cloud Composer, Looker Experience developing, deploying, and monitoring ML models on cloud infrastructure Expertise in Python, SQL, data orchestration, and cloud-native data tools Hands-on experience with cloud-native data lakes and lakehouses (e.g., Delta Lake, BigLake) Familiarity with infrastructure-as-code (Terraform/CDK) for cloud provisioning Knowledge of visualization tools (PowerBI, Tableau, Looker) integrated with cloud backends Strong command of statistical modeling, testing, and inference Advanced capabilities in data wrangling, transformation, and feature engineering Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Airflow) Strong communication and stakeholder engagement skills at the executive level Roles And Responsibilities Assist with analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions Develop and execute on project and analysis plans under the guidance of the Project Manager Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use case clarifications needed to get a strong hold on the data and the business problem to be solved Drive and conduct analysis using advanced analytics tools and coach the junior team members Implement the necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments Validate analysis outcomes and recommendations with all stakeholders including the client team Build storylines and make presentations to the client team and/or PwC project leadership team Contribute to knowledge and firm-building activities Professional And Educational Background BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master’s Degree / MBA from a reputed institute
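As a purely illustrative sketch of one forecasting technique named in this listing (Holt-Winters exponential smoothing), here is a minimal statsmodels example run on fabricated monthly demand data; the series and parameters are synthetic, not client data.

```python
# Minimal Holt-Winters demand-forecasting sketch on synthetic monthly data,
# illustrating one technique named in the listing. The data is fabricated for
# demonstration only.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Three years of synthetic monthly demand with an upward trend and yearly seasonality.
idx = pd.date_range("2021-01-01", periods=36, freq="MS")
rng = np.random.default_rng(0)
demand = pd.Series(
    200
    + 3 * np.arange(36)
    + 40 * np.sin(2 * np.pi * np.arange(36) / 12)
    + rng.normal(0, 10, 36),
    index=idx,
)

# Additive trend and seasonality with a 12-month seasonal period.
model = ExponentialSmoothing(demand, trend="add", seasonal="add", seasonal_periods=12)
fit = model.fit()

forecast = fit.forecast(6)  # expected demand for the next six months
print(forecast.round(1))
```

In an inventory-planning engagement, forecasts like these would feed downstream optimization models (for example, linear or mixed-integer programs for replenishment) of the kind the listing also mentions.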
Posted 1 week ago
India has a growing market for composer jobs, with various opportunities available for talented individuals in the music industry. Whether it's creating music for films, television, video games, or other media, composers play a vital role in shaping the overall experience for audiences. If you're considering a career in composing, here's a guide to help you navigate the job market in India.
Cities with vibrant entertainment industries often have a high demand for composers across a variety of projects.
The average salary range for composer professionals in India can vary depending on experience and expertise. Entry-level composers can expect to earn between INR 3-5 lakhs per year, while experienced composers with a strong portfolio can earn upwards of INR 10 lakhs per year.
In the field of composing, a typical career path may involve starting as a Junior Composer, then progressing to a Composer, Senior Composer, and eventually a Music Director or Lead Composer. As you gain more experience and recognition for your work, you may have the opportunity to work on larger projects and collaborate with well-known artists.
In addition to composing skills, it is beneficial for composers to have a good understanding of music theory, proficiency in music production software, excellent communication skills for collaborating with directors and producers, and the ability to work under tight deadlines.
As you prepare for composer roles in India, remember to showcase your unique talents and passion for music in your portfolio and interviews. With dedication and creativity, you can pursue a rewarding career in composing and contribute to the vibrant entertainment industry in India. Good luck with your job search!
Accenture
36723 Jobs | Dublin
Wipro
11788 Jobs | Bengaluru
EY
8277 Jobs | London
IBM
6362 Jobs | Armonk
Amazon
6322 Jobs | Seattle,WA
Oracle
5543 Jobs | Redwood City
Capgemini
5131 Jobs | Paris,France
Uplers
4724 Jobs | Ahmedabad
Infosys
4329 Jobs | Bangalore,Karnataka
Accenture in India
4290 Jobs | Dublin 2