2.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Job Description: Creative Content Writer

Position: Creative Content Writer (Storyteller & Poet with Social Media Expertise)
Location: Noida Sector 62
Job Type: Full-time
Salary: Best in Market

Overview:
We are looking for a highly creative and versatile Creative Content Writer with a strong science background to help bring our brand's story to life. If you have a passion for storytelling, an affinity for poetry, and a deep understanding of scientific concepts, we want to hear from you! This role blends creativity with science communication, perfect for those who can write engaging, thought-provoking narratives and poems and craft powerful social media content. As a Creative Content Writer, you will create a range of content, from captivating stories, poetic posts, and viral one-liners to science-focused content that resonates on social media platforms. Your writing will bridge the gap between complex scientific ideas and an audience seeking both education and entertainment.

Key Responsibilities:
Creative Science Storytelling: Write compelling, imaginative stories that explain complex scientific concepts in a clear, engaging, and accessible way. Whether it's for blogs, articles, or social media, your stories should inspire curiosity and excitement about science.
Poetry with Purpose: Create original poetry that incorporates scientific themes or ideas. Craft thought-provoking, emotional poems that resonate with audiences and align with our brand's voice.
Social Media Content Creation: Develop creative, attention-grabbing one-liners, two-liners, and captions for social media, blending your knowledge of science with compelling writing to spark engagement across platforms like Instagram, Twitter, Facebook, and LinkedIn.
Platform-Specific Content: Adapt your writing to fit the tone and style of each social media platform, ensuring that your posts drive conversations and engagement. Be mindful of character limits, hashtags, and audience preferences for each platform.
Trend Integration: Keep up with current scientific trends, viral challenges, and social media shifts. Integrate popular science topics or viral moments into your writing to ensure our content remains relevant and timely.
Campaign Support: Collaborate with the marketing and creative teams to design and execute content strategies, integrating scientific knowledge into promotional content, ad copy, and digital campaigns.
Audience Engagement: Craft posts and replies that not only inform but also invite interaction, encouraging discussions, comments, and shares among followers.

Qualifications:
Educational Background: 11th and 12th grade in a science stream (e.g., Biology, Chemistry, Physics, Environmental Science) is required. This role requires an understanding of scientific principles and the ability to communicate complex topics to a broader audience.
Experience: At least 2 years of professional writing experience, particularly in creative writing, storytelling, or poetry. Experience creating engaging science content for social media is a must.
Skills: Strong ability to translate complex science into accessible, engaging content. Exceptional writing, editing, and storytelling skills. Ability to create catchy, shareable one-liners, two-liners, and social media captions that blend science with creativity. Familiarity with social media platforms and knowledge of how to tailor content for engagement. Creativity and originality in producing content that is both informative and entertaining.
Portfolio: A portfolio showcasing your creative writing, social media content (including science-focused posts, captions, and one-liners), and poetry is required. Any examples of written content related to science will be a plus.

Key Traits We Are Looking For:
Science Communicator: You have the ability to break down scientific topics into digestible, engaging content that resonates with a wide audience.
Creative Innovator: Your creativity doesn't have limits; you can write compelling stories, poems, and social media content that blends art with science.
Social Media Savvy: You are aware of the latest trends, memes, and challenges on social media and can creatively integrate them into science-based content.
Collaborative Spirit: You thrive in a team-oriented environment, brainstorming and working closely with marketing, design, and creative teams to ensure content is aligned with the overall brand strategy.
Emotionally Intelligent Writer: You understand how to connect emotionally with an audience, whether through humor, empathy, or inspiration, and can craft content that reflects this.

What We Offer:
Creative Freedom: A chance to bring your unique storytelling and scientific knowledge to life.
Flexibility: Work remotely or from our office with flexible hours.
Collaborative Environment: Be part of a creative team that values innovation, experimentation, and fresh perspectives.
Competitive Salary: Receive compensation that reflects your skills, experience, and contributions.
Career Growth: Opportunities to advance in a rapidly growing company that values creativity and intellectual curiosity.
Posted 6 days ago
0.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Associate

Job Description & Summary:
A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation's objectives, regulatory and risk management environment, and the diverse needs of its critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to the organisation.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:

Architecture Design:
· Design and implement scalable, secure, and high-performance architectures for Generative AI applications.
· Integrate Generative AI models into existing platforms, ensuring compatibility and performance optimization.

Model Development and Deployment:
· Fine-tune pre-trained generative models for domain-specific use cases.
· Define the data collection, sanitization, and data preparation strategy for model fine-tuning.
· Well versed with machine learning paradigms such as supervised, unsupervised, and reinforcement learning, as well as deep learning.
· Well versed with ML models such as linear regression, decision trees, gradient boosting, random forests, and k-means.
· Evaluate, select, and deploy appropriate Generative AI frameworks (e.g., PyTorch, TensorFlow, CrewAI, AutoGen, LangGraph, agentic code and agent flows).

Innovation and Strategy:
· Stay up to date with the latest advancements in Generative AI and recommend innovative applications to solve complex business problems.
· Define and execute the AI strategy roadmap, identifying key opportunities for AI transformation.
· Good exposure to agentic design patterns.

Collaboration and Leadership:
· Collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders.
· Mentor and guide team members on AI/ML best practices and architectural decisions.
· Able to lead a team of data scientists, GenAI engineers, and software developers.

Performance Optimization:
· Monitor the performance of deployed AI models and systems, ensuring robustness and accuracy.
· Optimize computational costs and infrastructure utilization for large-scale deployments.

Ethical and Responsible AI:
· Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks.
· Implement safeguards to mitigate bias, misuse, and unintended consequences of Generative AI.

Mandatory skill sets:
· Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark.
· Experience with machine learning and artificial intelligence frameworks, models, and libraries (TensorFlow, PyTorch, scikit-learn, etc.).
· Strong knowledge of foundational LLMs (OpenAI GPT-4o, o1, Claude, Gemini, etc.) as well as open-source models such as Llama 3.2 and Phi.
· Proven track record with event-driven architectures and real-time data processing systems.
· Familiarity with Azure DevOps and other LLMOps tools for operationalizing AI workflows.
· Deep experience with Azure OpenAI Service and vector databases, including API integrations, prompt engineering, and model fine-tuning, or equivalent technologies in AWS/GCP.
· Knowledge of containerization technologies such as Kubernetes and Docker.
· Comprehensive understanding of data lakes and strategies for data management.
· Expertise in LLM frameworks including LangChain, LlamaIndex, and Semantic Kernel.
· Proficiency in cloud computing platforms such as Azure or AWS.
· Exceptional leadership, problem-solving, and analytical abilities.
· Superior communication and collaboration skills, with experience managing high-performing teams.
· Ability to operate effectively in a dynamic, fast-paced environment.

Preferred skill sets:
· Experience with additional technologies such as Datadog and Splunk.
· Programming languages like C#, R, and Scala.
· Relevant solution architecture certificates and continuous professional development in data engineering and GenAI.
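As a quick illustration of one of the classical algorithms the skill list names: k-means reduces to two alternating steps, assignment and centroid update. A toy one-dimensional sketch (illustrative only, not PwC code; real work would use scikit-learn as listed):

```python
# Toy 1-D k-means: alternate assignment and centroid-update steps.
# Illustrative only; production work would use scikit-learn's KMeans.
def assign(points, centroids):
    # Label each point with the index of its nearest centroid.
    return [min(range(len(centroids)), key=lambda c: (p - centroids[c]) ** 2)
            for p in points]

def update(points, labels, k):
    # Move each centroid to the mean of the points assigned to it.
    return [sum(p for p, l in zip(points, labels) if l == c) / labels.count(c)
            for c in range(k)]

points = [1.0, 1.5, 0.5, 5.0, 5.5, 4.5]
centroids = [0.0, 6.0]
for _ in range(5):
    labels = assign(points, centroids)
    centroids = update(points, labels, 2)
print(centroids)  # converges to the two cluster means, [1.0, 5.0]
```

The same alternation generalises to higher dimensions by swapping the squared difference for a vector distance.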
Years of experience required: 0-1 years
Education qualification: BE / B.Tech / MCA / M.Sc / M.E / M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor in Business Administration, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Java
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 6 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Got a Way with Words? Let's Turn That Talent into Impact.

We're MaSs, a future-forward digital marketing agency from the UK, with our delivery squad making waves in Kochi. We turn data into stories and ideas into action, crafting copy that doesn't just sit pretty: it drives results. AI might be rewriting the rules, but we still believe in the power of a human touch. If you can write words that make people stop, think, and act, this is your invitation to do it with us.

What's the Gig?
We're looking for a sharp, word-loving Content & Copywriting Intern/Trainee who wants to dive headfirst into the world of digital storytelling. If you obsess over the perfect headline, love crafting compelling narratives, and are curious about how AI can amplify creativity, you belong here. You'll work with our team of marketers to create copy that clicks, converts, and keeps audiences hooked. Plus, you'll learn to leverage AI tools to boost your workflow and sharpen your writing game.

What You'll Do (a.k.a. Your Superpowers)
- Write attention-grabbing copy for social media, websites, blogs, emails, and ads.
- Adapt your writing style to match different brands, audiences, and vibes.
- Collaborate with marketers to turn concepts into persuasive, personality-packed content.
- Learn how to use AI to brainstorm, edit, and optimize your copy.
- Stay on top of industry trends and audience psychology to make your writing smarter.
- Edit and fine-tune your work until it's pitch-perfect.

What We're Looking For (Is This You?)
Wordsmith-in-the-Making: You have a strong command of English (bonus points if you make grammar look effortless). You can make even the dullest topic sound fascinating. You love playing with language and know how to write with personality.
Curious & Adaptable: You're open to trying new writing styles and tones (from cheeky to corporate). You stay curious about how people think, act, and buy. You embrace AI tools to work smarter, not harder.
Hungry to Learn: You take feedback like a pro and use it to level up. You're excited to grow in a fast-paced, ever-evolving creative environment.

What's in It for You?
- Hands-on experience writing for real clients and real audiences.
- One-on-one mentorship from marketing pros who want you to shine.
- Exposure to cutting-edge AI writing tools and techniques.
- A fun, collaborative workspace where creativity rules.
- A shot at a full-time gig if you blow us away.

Ready to shape stories and spark action? Apply today
Posted 6 days ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the job
Warning: we are a super lean and young team of 70+. If growth is what you aspire to, then we should talk. If you are looking for a 9-to-6 role, this is NOT for you! We are not glorifying long hours, but at this juncture we need HUSTLERS who have a fire to grow and a positive intent. We have your BACK if you have OURS! Please read the full JD.

As one of our core team members, you'll be helping shape our sales & marketing culture at Skillinabox for Fashion Designing & Make-up Artistry. Working closely with the Founders and the Board, you will be directly responsible for leading our GTM strategies, building a customer base & working with campaigns. A "Beacon Role" to shape our growth in all dimensions and build a team as you grow.

Please Note: This is an entrepreneurial role where you will be joining the core team and working with the founders as well as the board directly. We are looking for someone with high energy & hunger to grow with us and propel their career in Edtech & Skill-tech.

You should apply if you:
- Want to be a founding member of our Sales & Marketing division.
- Love education & skilling and know the difference between the two.
- Are inclined towards our mission (do have a look at our website).
- Want to work on empowering lives and have a thirst to grow.
- Accept challenges and align your goals with your firm.
- Can take criticism and convert it into fuel for growth.
- Have a "Never Give In" attitude.
- Love developing relationships with people.

Key responsibility areas:

🔹 Team Management
- Set targets, monitor performance, and conduct reviews.
- Coach, train, and support team members.
- Align the team with the sales strategy and improve processes.
- Ensure CRM accuracy and pipeline tracking.
- Motivate the team and maintain a positive culture.

🔹 Individual Contribution
- Handle key accounts and close deals.
- Manage a personal pipeline and meet targets.
- Build strong client relationships.
- Provide market and customer insights.

Who are we looking for?
- 4-6 years of experience in sales/marketing/business development. Freshers with a fire to grow can be accepted.
- A highly motivated individual who enjoys building relationships with members and helps drive the adoption of our products and services.
- Hands-on experience in creating, testing & rolling out campaigns.
- Willing to get your hands dirty and push forward for growth.
- Someone who has the knack of hiring people.
- Has the power to present to an audience.
- Wants to grow at 100x with us.

Why Us?
- We are a one-of-a-kind vernacular skilling platform disrupting the skilling landscape.
- Skills >> Education is a notion we are spreading through hands-on skilling.
- Backed by leading angels in the industry as well as the government.
- With the founders and the board having a combined experience of 100+ years in skilling, we definitely know what we are doing.
- We believe people >> product and would love to have you onboard to help scale this ship and make a meaningful impact.
- Amazing growing team! We are growing 100% month over month!
- ESOPs for the core team.
- We are solving for Bharat & the world!

Who we are not looking for:
- Anyone looking for a part-time stint.
- Anyone for whom education and skilling don't spark curiosity & interest.
- Anyone who would not want to work for impact creation.
- Anyone not willing to call the team at 2am when a crazy, eccentric idea crosses your head. Yes, we want you to be supremely proactive.

Comp & Ben:
- Current compensation: ESOPs (stock options) + 5-6.5 LPA (in-hand) + 1.5-2 LPA (variable).
- Your next appraisal will come in a really short period (4-10 months), considering this is a core team position we are hiring for.
- An opportunity to own a part of the firm and have skin in the game (ESOPs).

Benefits:
- Become a part of the core team and work directly with the founders & the board.
- Start building a team around you, helping you move forward in your role.
Last Words: If you have read till the end, write a cover letter to careers@skillinabox.in telling us why we should pick you amongst the 2324362327 applications.
Posted 6 days ago
4.0 - 7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
Indirect Tax (GST)
▪ Working on monthly GST payments, return filings, and other compliances.
▪ Preparation of the GST input tax credit register.
▪ Reconciliation of hotel, airline, and other vendor GST credit with the purchase register.
▪ Coordinating with internal stakeholders, vendors, and hotels to ensure optimum input tax credit on hotel accommodations.
▪ Reconciliation of GST output liability: GL vs. invoice listing.
▪ Reconciliation of GST input tax credit: GL vs. GSTN portal.
▪ Collating details for preparing responses to tax officers' queries.
▪ Providing necessary GST documents (e-way bill, invoice, etc.) for asset movements, e.g., transfer of laptops, chargers, etc.
▪ Working with the systems team on necessary system enhancements for GST compliance.

What You'll Bring
Graduate/CA Inter with 4-7 years of relevant experience in the tax & compliance field, preferably indirect tax. Proficiency in desktop applications, financial systems, and ERP general ledgers. Must have worked extensively on Windows OS and the following software applications: MS Excel, MS Word, PowerPoint, and Oracle or any other large financial systems environment/ERP.

Who You'll Work With
This position is responsible for managing GST compliance for the various states where BCG has operations in India. You will ensure timely GST return filings, availing of correct input tax credit, and completion of the input credit register. You will also be responsible for maintaining various schedules and MIS required for multiple audits, whether internal, statutory, or GST audits. Further, a significant part of this role involves reconciling the hotel and airline input tax credits with the MIS and working closely with the operations team to avail optimum tax credits. You will ensure all processes are completed in a timely and accurate manner in accordance with BCG policies and procedures, and in compliance with statutory regulations. This individual will provide a high level of customer service to internal and external customers and governmental agencies.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
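The GL-vs-invoice-listing reconciliations described above boil down to comparing two keyed totals and reporting mismatches and gaps. A toy sketch with hypothetical invoice data (illustrative only; in practice this work happens in Excel or the ERP, as the listing notes):

```python
# Hypothetical GL balances and invoice-listing totals, keyed by invoice number.
gl = {"INV-1": 1800.0, "INV-2": 900.0, "INV-3": 450.0}
invoices = {"INV-1": 1800.0, "INV-2": 950.0}

# Invoices present in both sources but with differing amounts.
mismatched = {k: (gl[k], invoices[k])
              for k in gl.keys() & invoices.keys() if gl[k] != invoices[k]}
# Invoices booked in the GL but missing from the listing.
missing = sorted(gl.keys() - invoices.keys())

print(mismatched)  # {'INV-2': (900.0, 950.0)}
print(missing)     # ['INV-3']
```

The same shape of comparison applies to the GL-vs-GSTN-portal reconciliation, with the portal download as the second dictionary.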
Posted 6 days ago
5.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
We Are Hiring a Data Scientist (Remote)

Job Summary:
We are seeking an experienced Data Scientist with 5 to 6 years of hands-on experience to join our analytics and AI/ML team. The ideal candidate has a strong background in statistics, machine learning, data engineering, and business analytics, and is capable of turning complex data into actionable insights and predictive solutions.

Key Responsibilities:
• Work with stakeholders to identify business problems and translate them into data science solutions.
• Build, evaluate, and deploy machine learning and statistical models.
• Perform data wrangling, feature engineering, and exploratory data analysis (EDA) on large datasets.
• Design and implement data pipelines and ETL processes.
• Collaborate with engineering teams to deploy models into production.
• Interpret model results, validate assumptions, and communicate findings to non-technical stakeholders.
• Continuously monitor model performance and retrain as necessary.
• Stay up to date with the latest trends, tools, and technologies in data science and AI.

Required Skills & Qualifications:
• 5-6 years of experience in a data science or machine learning role.
• Strong proficiency in Python (pandas, NumPy, scikit-learn, TensorFlow/PyTorch).
• Solid knowledge of machine learning algorithms, deep learning, NLP, or time series forecasting.
• Experience with SQL and relational databases.
• Experience with data visualization tools like QuickSight, Power BI, or Matplotlib/Seaborn.
• Proficiency with cloud platforms like AWS.
• Familiarity with version control systems (Git), Docker, and CI/CD workflows.
• Strong communication skills and the ability to present technical findings clearly to business stakeholders.

Preferred Qualifications:
• Master's in Computer Science, Statistics, Data Science, Mathematics, or a related field.
• Experience working with big data tools such as Spark or Kafka.
• Prior experience in industries such as healthcare or retail.
• Background in MLOps or data science productization is a plus.

Send your resume to HR@muverity.com
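As a flavour of the "statistical models" bullet above: the simplest such model, a least-squares line fit, can be written from scratch in a few lines (a pure-Python sketch with made-up data; day-to-day work would use pandas and scikit-learn as listed):

```python
# Closed-form least-squares fit of y = a*x + b.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means.
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # 1.97 0.11
```

Evaluating such a fit on held-out data, rather than the training points, is the "evaluate" half of the build-evaluate-deploy loop the responsibilities describe.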
Posted 6 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're looking for a DevOps Engineer
This role is office-based at our Pune office.

We are looking for a skilled DevOps Engineer with hands-on experience in Kubernetes, CI/CD pipelines, cloud infrastructure (AWS/GCP), and observability tooling. You will be responsible for automating deployments, maintaining infrastructure as code, and optimizing system reliability, performance, and scalability across environments.

In this role, you will…
- Develop and maintain CI/CD pipelines to automate testing, deployments, and rollbacks across multiple environments.
- Manage and troubleshoot Kubernetes clusters (EKS, AKS, GKE), including networking, autoscaling, and application deployments.
- Collaborate with development and QA teams to streamline code integration, testing, and deployment workflows.
- Automate infrastructure provisioning using tools like Terraform and Helm.
- Monitor and improve system performance using tools like Prometheus, Grafana, and the ELK stack.
- Set up and maintain Kibana dashboards, and ensure high availability of logging and monitoring systems.
- Manage cloud infrastructure on AWS and GCP, optimizing for performance, reliability, and cost.
- Build unified observability pipelines by integrating metrics, logs, and traces.
- Participate in on-call rotations, handle incident response and root cause analysis, and continuously improve automation and observability.
- Write scripts and tools in Bash, Python, or Go to automate routine tasks and improve deployment efficiency.

You've Got What It Takes If You Have…
- 3+ years of experience in a DevOps, SRE, or Infrastructure Engineering role.
- A Bachelor's degree in Computer Science, IT, or a related field.
- A strong understanding of Linux systems, cloud platforms (AWS/GCP), and containerized microservices.
- Proficiency with Kubernetes, CI/CD systems, and infrastructure automation.
- Experience with monitoring/logging tools: Prometheus, Grafana, InfluxDB, and the ELK stack (Elasticsearch, Logstash, Kibana).
- Familiarity with incident management tools (e.g., PagerDuty) and root cause analysis processes.
- Basic working knowledge of:
  - Kafka: monitoring topics and consumer health
  - ElastiCache/Redis: caching patterns and diagnostics
  - InfluxDB: time-series data and metrics collection

Our Culture
Spark Greatness. Shatter Boundaries. Share Success. Are you ready? Because here, right now, is where the future of work is happening. Where curious disruptors and change innovators like you are helping communities and customers enable everyone, anywhere, to learn, grow, and advance. To be better tomorrow than they are today.

Who We Are
Cornerstone powers the potential of organizations and their people to thrive in a changing world. Cornerstone Galaxy, the complete AI-powered workforce agility platform, meets organizations where they are. With Galaxy, organizations can identify skills gaps and development opportunities, retain and engage top talent, and provide multimodal learning experiences to meet the diverse needs of the modern workforce. More than 7,000 organizations and 100 million+ users in 180+ countries and nearly 50 languages use Cornerstone Galaxy to build high-performing, future-ready organizations and people today.

Check us out on LinkedIn, Comparably, Glassdoor, and Facebook!
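The "scripts and tools to automate routine tasks" bullet often starts with something as small as an exponential-backoff retry around a flaky step. A minimal, hypothetical sketch in Python, one of the three languages the listing names (the deployment function is invented for illustration):

```python
import time

def retry(fn, attempts=4, base_delay=0.01):
    """Call fn, retrying with exponential backoff on failure."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * 2 ** i)

calls = {"n": 0}
def flaky_deploy():
    # Hypothetical deployment step that fails twice before succeeding.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "deployed"

print(retry(flaky_deploy))  # succeeds on the third attempt: deployed
```

In a real pipeline the retried call would be a kubectl rollout, a Helm upgrade, or an API request, and the backoff parameters would be tuned to the service's failure profile.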
Posted 6 days ago
6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Established in 2004, OLIVER is the world's first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results.

Role: Beauty Editor / Content Strategist
Location: Mumbai, India

About the role:
In this role, you will need creative vision, collaboration skills, the ability to understand and work with varied stakeholders, and the ability to always see and focus on the big picture. As Beauty Editor, you will own the editorial and content strategy for the brand and lead an innovative, best-in-class content strategy. You will work in close collaboration with internal cross-functional teams, external partners, and agencies to identify content trends and drive their adoption among Indian creators. You will be responsible for executing editorial planning and publishing strategies, delivering best-in-class community engagement, and bringing to life go-to-market plans for shorts content across owned and operated channels, social, influencer, visual events, and paid. This role will also require you to look at bold and exciting ways to activate influencers and events on social. You will have a good understanding of content needs for social platforms and know what assets work best and where. To thrive in this position, you love working in the area where creative ideation meets execution excellence. Being an awesome leader and teammate, you easily communicate with your team and other partners, including your colleagues across functions.

What you will be doing:
- Team leadership and line management of a social team spread across the region.
- Constant experimentation and content innovation by identifying breakthrough content.
- Lead the development of brand and cultural narratives.
- Work with internal and external partners to identify content ideas as well as drive Shorts adoption.
- Execute go-to-market plans and campaigns for the latest trends.
- Liaise with stakeholders to align strategy and content plans.
- Lead influencer content planning with PR agencies and internal influencer resources to ensure integration into owned and earned media plans.
- Approve all content plans, creative lay-downs for channels, monthly narratives, and assets.
- Approve all asset production briefs and monitor creative assets.
- Lead all content creation: content creator selection, on-site shoots, and asset production.
- Responsible for KPI/objective setting and measurement reporting.

What you need to be great in this role:
- A minimum of 6 years in content management & strategy across top-level agencies, with a strong digital background.
- Must have had a team of at least 4 people reporting to you in the past.
- Innovative and creative, with a clear vision while being detail-oriented.
- Ability to influence key stakeholders and to solve problems creatively.
- Excellent planning and organisation skills, with the ability to proactively organise and influence stakeholders and build strong and effective working relationships.
- The ability to effectively manage people through leadership and mentoring.
- The ability to manage and filter workflow as well as organise and prioritise workloads to maximise productivity.
- An experienced and passionate creator with a proven track record of projects from concept stage to completion.
- Highly creative, with the ability to generate ideas and practically contribute to studio output.
- Self-motivated, working with little supervision.
- A collaborative team player: open-minded and non-political.
- Proven ability to effectively lead creative teams.
- Proven ability to communicate and liaise with all levels in the business.
- Discreet about all confidential and personal information.
- Driven, proactive, helpful, and enthusiastic team player.

Req ID: 14057

Our values shape everything we do:
- Be Ambitious to succeed
- Be Imaginative to push the boundaries of what's possible
- Be Inspirational to do groundbreaking work
- Be always learning and listening to understand
- Be Results-focused to exceed expectations
- Be actively pro-inclusive and anti-racist across our community, clients and creations

OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws.

OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 6 days ago
0.0 - 1.0 years
0 - 0 Lacs
Jagatpura, Jaipur, Rajasthan
On-site
AWS Data Engineer

Location: Jaipur
Mode: On-site
Experience: 2+ Years

The Role
Zynsera is looking for a talented AWS Data Engineer to join our dynamic team! If you have a strong grasp of AWS services, serverless data pipelines, and Infrastructure as Code, let’s connect.

As an AWS Data Engineer at Zynsera, you will:
- Develop and optimize data pipelines using AWS Glue, Lambda, and Athena
- Build infrastructure using AWS CDK for automation and scalability
- Manage structured and semi-structured data with AWS Lakehouse & Iceberg
- Design serverless architectures for batch and streaming workloads
- Collaborate with senior engineers to drive performance and innovation

You're a great fit if you have:
- Proficiency in AWS Glue, Lambda, Athena, and Lakehouse architecture
- Experience with CDK, Python, PySpark, Spark SQL, or Java/Scala
- Familiarity with data lakes, data warehousing, and scalable cloud solutions
- (Bonus) Knowledge of Firehose, Kinesis, Apache Iceberg, or DynamoDB

Job Types: Full-time, Permanent
Pay: ₹25,316.90 - ₹45,796.55 per month
Ability to commute/relocate: Jagatpura, Jaipur, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Experience: AWS Data Engineer: 1 year (Required)
Work Location: In person
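To make the "serverless batch and streaming" bullet concrete, here is a minimal plain-Python sketch of the kind of transform logic a Lambda- or Glue-based pipeline applies to semi-structured events: parse, validate, and bucket records by a Hive-style partition key. The field names (`ts`, `user_id`, `event`) are hypothetical, not from the posting.

```python
import json
from datetime import datetime, timezone

def clean_record(raw: str):
    """Parse one raw JSON event; return a cleaned dict or None if invalid."""
    try:
        rec = json.loads(raw)
        ts = datetime.fromtimestamp(rec["ts"], tz=timezone.utc)
    except (json.JSONDecodeError, KeyError, TypeError, ValueError):
        return None  # malformed events are dropped, not crashed on
    return {
        "user_id": str(rec.get("user_id", "unknown")),
        "event": rec.get("event", "unknown"),
        # Hive-style partition value, as Glue/Athena expect (e.g. dt=2024-01-31)
        "dt": ts.strftime("%Y-%m-%d"),
    }

def transform_batch(lines):
    """Clean a batch of raw events and bucket the survivors by partition key."""
    partitions = {}
    for line in lines:
        rec = clean_record(line)
        if rec is not None:
            partitions.setdefault(rec["dt"], []).append(rec)
    return partitions
```

Partitioning by date on write is what lets Athena prune files at query time; the same shape works whether the batch arrives from S3 or a Kinesis trigger.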
Posted 6 days ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
Join us and drive the design and deployment of AI/ML frameworks revolutionizing telecom services. As a key member of our team, you will architect and build scalable, secure AI systems for service assurance, orchestration, and fulfilment, working directly with network experts to drive business impact. You will be responsible for defining architecture blueprints, selecting the right tools and platforms, and guiding cross-functional teams to deliver scalable AI systems. This role offers significant growth potential, mentorship opportunities, and the chance to shape the future of telecoms using the latest AI technologies and platforms.

Key Responsibilities
How you will contribute and what you will learn:
- Design end-to-end AI architecture tailored to telecom services business functions (e.g., service assurance, orchestration and fulfilment).
- Define data strategy and AI workflows, including the inventory model, ETL, model training, deployment, and monitoring.
- Evaluate and select AI platforms, tools, and frameworks suited to telecom-scale workloads, for development and testing of inventory services solutions.
- Work closely with telecom network experts and architects to align AI initiatives with business goals.
- Ensure scalability, performance, and security in AI systems across hybrid/multi-cloud environments.
- Mentor AI developers.

Key Skills And Experience
You have:
- 10+ years' experience in AI/ML design and deployment, with a graduate or equivalent degree.
- Practical experience with AI/ML techniques and scalable architecture design for telecom operations, inventory management, and ETL.
- Exposure to data platforms (Kafka, Spark, Hadoop), model orchestration (Kubeflow, MLflow), and cloud-native deployment (AWS SageMaker, Azure ML).
- Proficiency in programming (Python, Java) and DevOps/MLOps best practices.
It will be nice if you have:
- Worked with any of the LLM models (Llama family) and LLM agent frameworks like LangChain / CrewAI / AutoGen.
- Familiarity with telecom protocols, OSS/BSS platforms, 5G architecture, and NFV/SDN concepts.
- Excellent communication and stakeholder management skills.

About Us
Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed and cloud networks. Your career here will have a positive impact on people’s lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks and fearless to bring our authentic selves to work.

What we offer
Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and get supported by employee resource groups, mentoring programs and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer.

Nokia has received the following recognitions for its commitment to inclusion & equality:
- One of the World’s Most Ethical Companies by Ethisphere
- Gender-Equality Index by Bloomberg
- Workplace Pride Global Benchmark

At Nokia, we act inclusively and respect the uniqueness of people. Nokia’s employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed.
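The "model training, deployment, and monitoring" workflow mentioned above typically includes a drift check on live features. As one illustrative sketch (not the posting's actual method), the Population Stability Index compares a live sample's distribution against the training baseline, in plain Python:

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline and a live sample.

    A common monitoring heuristic: PSI < 0.1 is stable, 0.1-0.25 suggests
    drift, and > 0.25 signals significant drift (thresholds vary by team).
    """
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[-1] = float("inf")  # catch live values above the baseline range

    def frac(sample):
        counts = [0] * bins
        for x in sample:
            for i in range(bins):
                if edges[i] <= x < edges[i + 1]:
                    counts[i] += 1
                    break
            else:
                counts[0] += 1  # below baseline range: lowest bucket
        # Smooth empty buckets so the log and division stay defined
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

In an MLOps pipeline this check would run on a schedule (e.g. via Kubeflow or an MLflow job) and page the team or trigger retraining when the score crosses the chosen threshold.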
About The Team As Nokia's growth engine, we create value for communication service providers and enterprise customers by leading the transition to cloud-native software and as-a-service delivery models. Our inclusive team of dreamers, doers and disruptors push the limits from impossible to possible.
Posted 6 days ago
10.0 - 12.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
TCS presents an excellent opportunity for a Data Architect.

Job Description:
Skills: AWS, Glue, Redshift, PySpark
Location: Pune / Kolkata
Experience: 10 to 12 Years

- Strong hands-on experience in Python programming and PySpark.
- Experience using AWS services (Redshift, Glue, EMR, S3 & Lambda).
- Experience working with Apache Spark and the Hadoop ecosystem.
- Experience in writing and optimizing SQL for data manipulation.
- Good exposure to scheduling tools; Airflow is preferred.
- Must-have: data warehouse experience with AWS Redshift or Hive.
- Experience in implementing security measures for data protection.
- Expertise in building and testing complex data pipelines for ETL processes (batch and near real time).
- Readable documentation of all the components being developed.
- Knowledge of database technologies for OLTP and OLAP workloads.
Posted 6 days ago
6.0 - 10.0 years
35 - 38 Lacs
Ahmedabad, Gujarat, India
On-site
The Role: Lead I Software Engineer
The Location: Hyderabad/Ahmedabad, India

The Team: We are looking for a highly motivated, enthusiastic and skilled software engineer with experience in architecting and building solutions to join an agile scrum team developing technology solutions. The team is responsible for developing and ingesting various datasets into the product platforms utilizing the latest technologies.

The Impact: Contribute significantly to the growth of the firm by:
- Developing innovative functionality in existing and new products
- Supporting and maintaining high-revenue products
- Achieving the above intelligently and economically using best practices

What's in it for you:
- Build a career with a global company.
- Work on products that fuel the global financial markets.
- Grow and improve your skills by working on enterprise-level products and new technologies.

Responsibilities:
- Architect, design, and implement software-related projects.
- Perform analysis and articulate solutions.
- Manage and improve existing solutions.
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.
- Collaborate effectively with technical and non-technical stakeholders.
- Actively participate in all scrum ceremonies, following Agile principles and best practices.

What We're Looking For:
Basic Qualifications:
- Bachelor's degree in computer science or equivalent
- 6 to 10 years' experience in application development
- Willingness to learn and apply new technologies
- Excellent communication skills, with strong verbal and writing proficiencies
- Good work ethic, self-starter, and results-oriented
- Excellent problem-solving & troubleshooting skills
- Ability to manage multiple priorities efficiently and effectively within specific timeframes
- Strong hands-on development experience in C# and Python
- Strong hands-on experience in building large-scale solutions using a big data technology stack like Spark, microservice architecture, and tools like Docker and Kubernetes
- Experience in conducting application design and code reviews
- Able to demonstrate strong OOP skills
- Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development
- Experience implementing web services
- Experience working with SQL Server; ability to write stored procedures, triggers, performance tuning, etc.
- Experience working in cloud computing environments such as AWS

Preferred Qualifications:
- Experience with large-scale messaging systems such as Kafka is a plus
- Experience working with big data technologies like Elasticsearch and Spark is a plus
- Experience working with Snowflake is a plus
- Experience with a Linux-based environment is a plus
Posted 6 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills: Python, PySpark, ETL, data pipelines, big data, AWS, GCP, Azure, data warehousing, Spark, Hadoop

A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

We also look for:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
India
Remote
Mandatory skills: Azure Databricks, Azure Data Factory, PySpark, SQL
Experience: 5 to 8 years
Location: Remote

Key Responsibilities:
- Design and build data pipelines and ETL/ELT workflows using Azure Databricks and Azure Data Factory
- Ingest, clean, transform, and process large datasets from diverse sources (structured and unstructured)
- Implement Delta Lake solutions and optimize Spark jobs for performance and reliability
- Integrate Azure Databricks with other Azure services, including Data Lake Storage, Synapse Analytics, and Event Hubs
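The "implement Delta Lake solutions" bullet usually centres on MERGE (upsert) semantics: match incoming rows to the target on a key, update the matches, insert the rest. The following is a minimal plain-Python sketch of that logic, not the Delta Lake API itself; the record shape is a made-up example.

```python
def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target`; both are lists of dicts keyed by `key`."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
print(merge_upsert(target, updates))
# [{'id': 1, 'qty': 5}, {'id': 2, 'qty': 7}, {'id': 3, 'qty': 1}]
```

In Databricks the same operation is a single `MERGE INTO` statement against a Delta table, executed atomically so readers never see a half-applied batch.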
Posted 6 days ago
4.0 years
15 - 30 Lacs
Gurugram, Haryana, India
Remote
Experience: 4+ years
Salary: INR 1500000-3000000 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by NuStudio.AI)
(*Note: This is a requirement for one of Uplers' clients - an AI-first, API-powered Data Platform)

What do you need for this opportunity?
Must-have skills: Databricks, dbt, Delta Lake, Spark, Unity Catalog, AI, Airflow, ETL/ELT, GCP (BigQuery, Pub/Sub, Dataflow, Cloud Functions, Cloud Storage), PySpark, AWS, Hadoop

An AI-first, API-powered Data Platform is looking for: We're scaling our platform and seeking Data Engineers who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems.

As a Data Engineer, you'll:
- Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions)
- Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows
- Work across batch + real-time architectures that feed LLMs and AI/ML systems
- Own feature engineering pipelines that power production models and intelligent agents
- Collaborate with platform and ML teams to design observability, lineage, and cost-aware, performant solutions
- Bonus: experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on them

Why Us?
- Building production-grade data & AI solutions
- Your pipelines directly impact mission-critical and client-facing interactions
- Lean team, no red tape: build, own, ship
- Remote-first with an async culture that respects your time
- Competitive comp and benefits

Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, dbt, Kubernetes, LangChain, LLMs

How to apply for this opportunity?
Step 1: Click on Apply! and register or log in on our portal.
Step 2: Complete the screening form & upload your updated resume.
Step 3: Increase your chances of getting shortlisted & meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
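The "batch + real-time" bullet in the posting above comes down to windowed aggregation. As a plain-Python sketch of the concept a Pub/Sub + Dataflow pipeline implements (not the GCP API itself), here is a tumbling-window count per event key, with hypothetical event names:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """events: iterable of (epoch_seconds, key). Returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to the window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "view"), (65, "click"), (120, "view")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1, (120, 'view'): 1}
```

A real streaming job adds watermarking and late-data handling on top of this; the same aggregation run over a day of files instead of a live stream is the batch half of the architecture.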
Posted 6 days ago
4.0 years
15 - 30 Lacs
Cuttack, Odisha, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Kolkata, West Bengal, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Guwahati, Assam, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Raipur, Chhattisgarh, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Jamshedpur, Jharkhand, India
Remote
Experience : 4.00 + years Salary : INR 1500000-3000000 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: NuStudio.AI) (*Note: This is a requirement for one of Uplers' client - AI-first, API-powered Data Platform) What do you need for this opportunity? Must have skills required: Databricks, dbt, Delta Lake, Spark, Unity catalog, AI, Airflow, Cloud Function, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, Functions), GCP (BigQuery, Pub/Sub, PySpark, AWS, Hadoop AI-first, API-powered Data Platform is Looking for: We’re scaling our platform and seeking Data Engineers (who are passionate about building high-performance data pipelines, products, and analytical pipelines in the cloud to power real-time AI systems. As a Data Engineer, you’ll: Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, PySpark, Dataflow, Cloud Storage, Functions) Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows Work across batch + real-time architectures that feed LLMs and AI/ML systems Own feature engineering pipelines that power production models and intelligent agents Collaborate with platform and ML teams to design observability, lineage, and cost-aware performant solutions Bonus: Experience with AWS, Databricks, Hadoop (Delta Lake, Spark, dbt, Unity Catalog) or interest in building on it Why Us? Building production-grade data & AI solutions Your pipelines directly impact mission-critical and client-facing interactions Lean team, no red tape — build, own, ship Remote-first with async culture that respects your time Competitive comp and benefits Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, Spark, dbt, Kubernetes, LangChain, LLMs How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago
4.0 years
15 - 30 Lacs
Ranchi, Jharkhand, India
Remote
Experience: 4.00+ years
Salary: INR 15,00,000 - 30,00,000 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by NuStudio.AI)
(*Note: This is a requirement for one of Uplers' clients - an AI-first, API-powered data platform.)

What do you need for this opportunity?

Must-have skills: Databricks, dbt, Delta Lake, Spark, Unity Catalog, AI, Airflow, Cloud Functions, Cloud Storage, Databricks Workflows, Dataflow, ETL/ELT, GCP (BigQuery, Pub/Sub), PySpark, AWS, Hadoop

An AI-first, API-powered data platform is looking for: We're scaling our platform and seeking Data Engineers who are passionate about building high-performance data pipelines, data products, and analytical pipelines in the cloud to power real-time AI systems.

As a Data Engineer, you'll:
Build scalable ETL/ELT and streaming data pipelines using GCP (BigQuery, Pub/Sub, Dataflow, Cloud Storage, Cloud Functions) and PySpark
Orchestrate data workflows with Airflow, Cloud Functions, or Databricks Workflows
Work across batch and real-time architectures that feed LLMs and AI/ML systems
Own feature engineering pipelines that power production models and intelligent agents
Collaborate with platform and ML teams to design solutions that are observable, lineage-aware, cost-conscious, and performant

Bonus: Experience with AWS, Databricks, or Hadoop (Delta Lake, Spark, dbt, Unity Catalog), or an interest in building on them

Why Us?
Build production-grade data and AI solutions
Your pipelines directly impact mission-critical and client-facing interactions
Lean team, no red tape: build, own, ship
Remote-first with an async culture that respects your time
Competitive compensation and benefits

Our Stack: Python, SQL, GCP/Azure/AWS, Spark, Kafka, Airflow, Databricks, dbt, Kubernetes, LangChain, LLMs

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help our talents find and apply for relevant opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 6 days ago