Chennai, Tamil Nadu, India
Not disclosed
Remote
Full Time
Role: Full-time, Permanent
Location: Chennai (Hybrid: 2 days WFO, 3 days WFH)
Preferably immediate joiners

Key Skills:
- Strong GCP cloud experience
- Proficiency in AI tools used to prepare and automate data pipelines and ingestion
- Apache Spark, especially MLlib
- PySpark and Dask for distributed data processing
- Pandas and NumPy for local data wrangling
- Apache Airflow to schedule and orchestrate ETL/ELT jobs
- Google Cloud (BigQuery, Vertex AI)
- Python (the most popular language for AI and data tasks)

About us
OneMagnify is a global performance marketing company that blends brand strategy, data analytics, and cutting-edge technology to drive measurable business results. With a strong focus on innovation and collaboration, OneMagnify partners with clients to create personalized marketing solutions that enhance customer engagement and deliver real-time impact. The company is also known for its award-winning workplace culture, emphasizing employee growth and inclusion.

🌟 Why Join OneMagnify?
- Top Workplace: Consistently recognized in the U.S. and India for a great work culture.
- Cutting-Edge Tech: Work with modern tools like Databricks, Snowflake, Azure, and MLflow.
- Growth-Focused: Strong career paths, upskilling, and learning opportunities.
- Global Impact: Collaborate across teams on high-impact, data-driven projects.
- Great Benefits: Competitive salary, insurance, paid holidays, and more.
- Meaningful Work: Solve real-world business challenges with innovative solutions.
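As a hedged illustration of the pipeline work this role describes, the sketch below wires a small PySpark cleaning step into an Airflow DAG. The DAG id, bucket paths, and column names are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: a daily Airflow DAG that runs a PySpark cleaning job.
# Paths, ids, and column names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def clean_daily_events():
    """Read raw CSVs, drop malformed rows, and write Parquet."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("clean_daily_events").getOrCreate()
    raw = spark.read.csv("gs://example-bucket/raw/events/", header=True)
    cleaned = (
        raw.dropna(subset=["event_id", "event_ts"])
           .dropDuplicates(["event_id"])
    )
    cleaned.write.mode("overwrite").parquet("gs://example-bucket/clean/events/")
    spark.stop()


with DAG(
    dag_id="daily_event_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="clean_events", python_callable=clean_daily_events)
```

In practice the same callable could be swapped for a Dask task or a BigQuery load step; the orchestration pattern stays the same.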
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Job Description

Team: Data Platform, Application & Data
Reports to: Engineering Manager, Data Platform, Application & Data
Minimum Education: Bachelor’s Degree or Equivalent Experience
Recommended Tenure: 7+ years in data platform engineering or architecture roles, including at least 3 years in a hands-on architecture role

Please fill in your details here for a quicker response: https://forms.gle/7gax5tbJUDBxfhPCA

Role Summary:
The Data Architect is responsible for designing and managing robust, scalable data solutions that ensure data accuracy, accessibility, and security to support both internal and client-facing needs. As part of the Data Platform team, this role collaborates closely with Solution Architects to align data architecture with broader application design, and partners with data engineers to implement and optimize solutions that power analytics, integrations, and real-time processing.

Architecture & Design:
- Translate business and technical requirements into data models and architecture specifications
- Design and document data architecture artifacts, including logical/physical data models, data flow diagrams, and integration patterns
- Align data models with application architecture and system interaction patterns in partnership with Solution Architects
- Establish and maintain design patterns for relational, NoSQL, and streaming-based data solutions

Solution Delivery & Support:
- Serve as a hands-on architecture lead during project discovery, planning, and delivery phases
- Support data engineers in implementing data architecture that aligns with platform and project requirements
- Validate implementation through design reviews and provide guidance throughout the development lifecycle
- Contribute to platform evolution by defining and enforcing scalable, reusable architecture practices

Data Governance & Quality:
- Define and uphold best practices for data modeling, data security, lineage tracking, and performance tuning
- Promote consistency in metadata, naming conventions, and data access standards across environments
- Support data privacy, classification, and auditability across integrated systems

Cross-Functional Collaboration:
- Work closely with product managers, engineering leads, DevOps, and analytics teams to deliver scalable and future-proof data solutions
- Collaborate with Solution Architects to ensure integrated delivery across application and data domains
- Act as a subject matter expert on data structure, semantics, and lifecycle across key business domains

Key Competencies:
- 7+ years of experience in data engineering or data architecture roles, including 3+ years in a dedicated architecture capacity
- Proven experience in cloud platforms (preferably GCP and/or Azure) with strong familiarity with native data services
- Deep understanding of data storage paradigms including relational, NoSQL, and object storage
- Hands-on experience with databases such as Oracle and Postgres; Python proficiency preferred
- Familiarity with modern DevOps practices including infrastructure-as-code and CI/CD for data pipelines
- Strong communication skills with the ability to lead through influence across technical and non-technical audiences
- Self-starter with excellent organization and prioritization skills across multiple initiatives
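As a hedged sketch of the logical-to-physical modeling work this role covers, the snippet below renders a tiny two-entity model as SQLAlchemy table definitions of the kind that could target Postgres. The entities, columns, and connection string are illustrative assumptions, not part of the job description.

```python
# Minimal sketch: a logical customer/order model expressed as physical DDL
# via SQLAlchemy. All table and column names are hypothetical examples.
from sqlalchemy import (
    Column, DateTime, ForeignKey, Integer, Numeric, String, create_engine,
)
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customers"
    customer_id = Column(Integer, primary_key=True)
    email = Column(String(255), nullable=False, unique=True)
    created_at = Column(DateTime, nullable=False)


class Order(Base):
    __tablename__ = "orders"
    order_id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customers.customer_id"), nullable=False)
    total_amount = Column(Numeric(12, 2), nullable=False)
    placed_at = Column(DateTime, nullable=False)


if __name__ == "__main__":
    # Emit the DDL against an in-memory SQLite engine just to show the model;
    # a real deployment would point at a Postgres connection string instead.
    engine = create_engine("sqlite:///:memory:", echo=True)
    Base.metadata.create_all(engine)
```

Keeping the model in code like this also plays well with the infrastructure-as-code and CI/CD practices the posting mentions, since schema changes become reviewable diffs.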
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Job Description

OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company’s core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion.

OneMagnify’s commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace and Cool Workplace in the United States for 10 consecutive years and was recently recognized as a Top Workplace in India.

The Data Engineering team is a dynamic group dedicated to transforming raw data into actionable insights. As a Databricks Engineer, you will architect, build, and maintain our data infrastructure on the Databricks Lakehouse Platform. You will collaborate with data scientists, analysts, and engineers to deliver world-class data solutions that drive our business forward.

About You:
- Eager to address complex technical challenges with a strong engineering approach.
- Outstanding execution and care for delivering robust and efficient data solutions.
- A deep understanding of cloud-based data technologies and standard methodologies for data engineering.
- Ability to develop and implement scalable data models and data warehousing solutions using Databricks.

What you’ll do:
- Architect, develop, and deploy scalable and reliable data infrastructure and pipelines using Databricks and Spark.
- Design and implement data models and data warehousing solutions with a focus on performance and scalability.
- Optimize data processing frameworks and infrastructure for maximum efficiency and cost-effectiveness.
- Collaborate with data scientists and analysts to understand their data needs and engineer solutions.
- Implement robust data quality frameworks, monitoring systems, and alerting mechanisms.
- Design, build, and maintain efficient ETL/ELT processes.
- Integrate Databricks with various data sources, systems, and APIs.
- Contribute to the definition and implementation of data governance, security, and compliance policies.
- Stay current with the latest advancements in Databricks, cloud data engineering standard methodologies, and related technologies.

What you’ll need:
- Bachelor's degree in Computer Science, Engineering, or a related technical field (or equivalent practical experience).
- 5+ years of experience in data engineering or a similar role with a strong emphasis on building and maintaining data infrastructure.
- Deep understanding and practical experience with the Databricks Lakehouse Platform and its core engineering aspects.
- Expert-level proficiency with big data processing frameworks, particularly Apache Spark.
- Strong hands-on experience with programming languages such as Python (PySpark) and/or Scala.
- Solid experience with SQL and data warehousing principles, including schema design and performance tuning.
- Proven experience with cloud platforms such as AWS, Azure, or GCP.
- Comprehensive understanding of data modeling, ETL/ELT architecture, and data quality engineering principles.
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication and collaboration skills, with the ability to explain technical concepts to both technical and non-technical audiences.

Benefits
We offer a comprehensive benefits package including Medical Insurance, PF, Gratuity, paid holidays, and more.

About us
Whether it’s awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges.

We are an equal opportunity employer
We believe that innovative ideas and solutions start with unique perspectives. That’s why we’re committed to providing every employee a workplace that’s free of discrimination and intolerance. We’re proud to be an equal opportunity employer and actively search for like-minded people to join our team.

Please fill in your details at the link below for a quicker response: https://docs.google.com/forms/d/e/1FAIpQLSedUA2bDNz35xC-o3A9Pt6hvOvZo24aUPfU850vZlPvOQRR_w/viewform?usp=preview
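As a hedged illustration of the ETL/ELT work the Databricks role describes, the sketch below shows a small batch PySpark job in the style commonly run on Databricks: read raw JSON, apply a data-quality gate, and write a curated Delta table. The paths, table name, and quality rule are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: batch ETL from raw JSON to a curated Delta table.
# All paths and names are hypothetical; Delta support is built in on Databricks.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw landing zone (hypothetical mount path).
raw = spark.read.json("/mnt/landing/orders/")

# Transform: basic data-quality gate plus a derived partition-friendly column.
curated = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load: overwrite the curated table; Delta is the default table format on Databricks.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```

A production version of this job would typically layer on the monitoring and alerting the posting calls for, for example by counting rejected rows and failing the run when the rejection rate exceeds a threshold.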
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
You will be responsible for developing and maintaining web applications using React.js, HTML, and CSS. This includes building responsive web applications and designing reusable UI components and libraries. You will integrate and consume RESTful APIs and/or GraphQL APIs to facilitate dynamic data interactions within the application. In addition, you will optimize application performance by identifying and addressing bottlenecks and improving code efficiency.

Collaboration with designers, product managers, and backend developers is essential to ensure alignment with project requirements and goals. You will be expected to write clean, maintainable, and well-documented code, participate in code reviews, and contribute to the improvement of development practices. Troubleshooting issues related to UI, performance, and data interactions, and staying updated with the latest industry trends, tools, and technologies related to web development and React.js are also key responsibilities.

The ideal candidate should have a minimum of 8 to 10 years of professional experience in front-end web development with a focus on React.js, HTML, CSS, and API integration. Experience with eCommerce platforms such as Shopify, BigCommerce, Adobe Commerce, etc., and/or CMS platforms like Optimizely CMS, Contentful, Contentstack, Storyblock.io is preferred. Proficiency in React.js and related libraries, expertise in modern HTML5 and CSS3, and experience with responsive design principles and frameworks like Bootstrap and Tailwind CSS are required. Familiarity with consuming and integrating RESTful APIs and/or GraphQL APIs, version control systems (preferably Git), strong analytical and problem-solving skills, excellent communication skills, and the ability to work effectively in a collaborative team environment are essential. A Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience) is also desired.