4 Database Management Job Openings in Argentina
Data Modeling Junior Manager – Marketing Effectiveness
Posted 6 days ago
Job Description
The Modelling Manager will be involved in projects from a modelling perspective.
The responsibilities of this position range from understanding project briefs from internal stakeholders and identifying the right analytic solutions from the product portfolio to answer client questions, to guiding team members to execute them error-free and ensuring results are delivered on time.
The work is heavy on data analysis, statistical modelling, and, finally, presenting results from a business/non-technical point of view. The expectation is to carry out analyses and lead the project and modelling independently, with minimal support.
Responsibilities
· Execute analyses error-free and on time, and ensure the same for the team they lead
· Manage execution of multiple analyses within a project and build efficiencies by identifying faster and simpler ways to improve existing solutions
· Lead discussions with internal stakeholders and effectively tackle challenges in the assigned project
· Carry out multiple standard/non-standard analyses to help build quality, insightful proposals and final insights presentations
· Gain a detailed understanding of, and develop expertise in, existing analytical solutions
· Based on experience in the team, assess final model results, decide how to storyboard results for clients, and guide other team members in model finalization
· Always look out for and experiment with different techniques/methods to improve existing solutions
· Help in R&D studies by carrying out analyses as planned
Qualifications
· Excellent, clear, and confident communication
· 5-7+ years of experience in a data analysis and modelling role
· Awareness of the market research and FMCG/T&D industries is preferred
· Experience with Market Mix Modelling (MMM) is preferred
· Able to work collaboratively in a team and guide junior associates
· Thorough understanding and working knowledge of statistical techniques such as regression, cluster and factor analyses, forecasting, and significance testing
· Hands-on with data processing/programming tools R/Python; hands-on Python experience is preferred
· Proficiency in Microsoft Excel for data analysis and reporting
· Strong analytical and problem-solving skills with attention to detail
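As a rough illustration of the kind of regression work a Market Mix Modelling role involves, here is a minimal sketch: a geometric adstock transform plus ordinary least squares on toy data. All numbers and function names are invented for illustration; this is not NIQ's methodology or product portfolio.

```python
import numpy as np

def adstock(spend, decay=0.5):
    """Apply geometric adstock: past media spend carries over with decay."""
    out = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for i, s in enumerate(spend):
        carry = s + decay * carry
        out[i] = carry
    return out

# Toy weekly data: media spend drives sales with carryover plus noise
rng = np.random.default_rng(0)
spend = rng.uniform(0, 100, size=52)
sales = 200.0 + 1.5 * adstock(spend) + rng.normal(0, 5, size=52)

# Fit sales ~ adstocked spend by ordinary least squares
X = np.column_stack([np.ones(52), adstock(spend)])
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(beta)  # coefficients land near the true values (200, 1.5)
```

In a real MMM, the adstock decay and saturation curves would themselves be estimated, and the model would include many media channels plus base drivers such as price and seasonality.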
Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)
About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population.
For more information, visit NIQ.com
Want to keep up with our latest updates?
Follow us on: LinkedIn | Instagram | Twitter | Facebook
Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center.
Data Engineer – Data Pipelines & Modeling
Posted today
Job Description
This position is only for professionals based in Argentina or Uruguay
We're looking for a data engineer for one of our clients' teams. You will help enhance and scale the data transformation and modeling layer. This role will focus on building robust, maintainable pipelines using dbt, Snowflake, and Airflow to support analytics and downstream applications. You'll work closely with the data, analytics, and software engineering teams to create scalable data models, improve pipeline orchestration, and ensure trusted, high-quality data delivery.
Key Responsibilities:
- Design, implement, and optimize data pipelines that extract, transform, and load data into Snowflake from multiple sources using Airflow and AWS services
- Build modular, well-documented dbt models with strong test coverage to serve business reporting, lifecycle marketing, and experimentation use cases
- Partner with analytics and business stakeholders to define source-to-target transformations and implement them in dbt
- Maintain and improve our orchestration layer (Airflow/Astronomer) to ensure reliability, visibility, and efficient dependency management
- Collaborate on data model design best practices, including dimensional modeling, naming conventions, and versioning strategies
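The extract-transform-load flow described above can be sketched in miniature. Here `sqlite3` stands in for Snowflake and plain functions stand in for Airflow tasks; the table, columns, and data are all invented for illustration.

```python
import sqlite3

def extract():
    # Stand-in for pulling raw rows from a source system
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.25"}]

def transform(rows):
    # Cast string amounts to numbers and add a derived column
    return [{"id": r["id"], "amount": float(r["amount"]),
             "amount_cents": int(float(r["amount"]) * 100)} for r in rows]

def load(rows, conn):
    # Stand-in for loading into a warehouse table
    conn.execute("CREATE TABLE IF NOT EXISTS fact_payment "
                 "(id INTEGER, amount REAL, amount_cents INTEGER)")
    conn.executemany("INSERT INTO fact_payment VALUES "
                     "(:id, :amount, :amount_cents)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount_cents) FROM fact_payment").fetchone()[0]
print(total)  # 1375
```

In the stack the posting describes, the transform step would live in dbt models rather than Python, with Airflow orchestrating extraction and load.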
Core Skills & Experience:
- dbt: Hands-on experience developing dbt models at scale, including use of macros, snapshots, testing frameworks, and documentation. Familiarity with dbt Cloud or CLI workflows
- Snowflake: Strong SQL skills and understanding of Snowflake architecture, including query performance tuning, cost optimization, and use of semi-structured data
- Airflow: Solid experience managing Airflow DAGs, scheduling jobs, and implementing retry logic and failure handling; familiarity with Astronomer is a plus
- Data Modeling: Proficient in dimensional modeling and building reusable data marts that support analytics and operational use cases
- AWS (Nice to Have): Familiarity with AWS services such as DMS, Kinesis, and Firehose for ingesting and transforming data
- Segment (Nice to Have): Familiarity with event data and related flows, piping data in and out of Segment
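The retry logic and failure handling mentioned for Airflow can be illustrated with a simplified stand-alone loop. Real Airflow configures this declaratively via a task's `retries` and `retry_delay` parameters rather than with code like this; the flaky task below is invented for the example.

```python
import time

def run_with_retries(task, retries=3, retry_delay=0.01):
    """Re-run a failing task, mirroring Airflow's retries/retry_delay
    task parameters (simplified; Airflow tracks this per task instance)."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # retries exhausted: surface the failure for alerting
            time.sleep(retry_delay)  # back off before the next attempt

calls = {"n": 0}
def flaky_extract():
    # Fails twice, then succeeds, like a transiently unavailable source
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return "ok"

result = run_with_retries(flaky_extract)
print(result)  # prints: ok
```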
731 - Data Scientist Ssr/Sr (AI-ML/Modeling in Retail/CPG) · LATAM
Posted 6 days ago
Job Description
Remote
- International projects
- Darwoft
Are you a Senior Data Scientist with experience in Python, predictive models, and a retail/CPG focus? At Darwoft, we are looking for 4 exceptional talents for innovative projects, developing solutions that make a difference.
What will you do?
- Data exploration and analysis: Process and analyze structured and unstructured data to uncover trends and generate strategic insights.
- Model development: Design, build, and evaluate predictive models and machine learning algorithms, especially around price elasticity and forecasting.
- Prototypes and solutions: Build rapid prototypes (POCs and MVPs) that solve real business problems and turn into actionable solutions.
- Visualization and communication: Present complex insights through clear visualizations, with data-driven recommendations for technical and non-technical stakeholders.
- Collaboration: Work with engineering and development teams to ensure the scalability and reliability of models in production.
What we are looking for:
- Education: University degree or higher in Data Science, Statistics, Computer Science, or a related field.
- Experience: At least 2 years in retail/CPG, working with pricing models and demand forecasting.
- Key technical skills: Advanced Python programming (pandas, object-oriented programming) and SQL; time-series analysis, optimization, and forecasting models; prototyping, QA, and optimizing models for production; interpreting complex data and communicating insights.
Preferred skills: Familiarity with ML tools such as PyTorch, TensorFlow, scikit-learn, Jupyter Notebooks, NumPy; experience processing big data with Spark, PySpark, or Azure Databricks; building interfaces with Flask, Plotly, or Streamlit; knowledge of continuous integration, testing, deployment, and release methodologies.
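As an illustration of the price-elasticity modelling this role centers on, here is a minimal log-log regression on synthetic data. All figures are invented; this is not Darwoft's or any client's actual model.

```python
import numpy as np

# Synthetic demand data with a known elasticity of -1.8:
# demand falls about 1.8% for every 1% increase in price.
rng = np.random.default_rng(42)
price = rng.uniform(5, 15, size=200)
demand = 1000 * price ** -1.8 * np.exp(rng.normal(0, 0.05, size=200))

# In a log-log model, the slope on log(price) is the price elasticity
X = np.column_stack([np.ones(200), np.log(price)])
beta, *_ = np.linalg.lstsq(X, np.log(demand), rcond=None)
elasticity = beta[1]
print(round(elasticity, 2))  # close to -1.8
```

Production work would extend this with cross-price terms, promotions, seasonality, and a forecasting layer, but the log-log slope is the standard starting point for elasticity.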
Why join Darwoft? At Darwoft, we share a passion for innovation and impact. You will work on challenging international projects, collaborating with highly skilled colleagues. We focus on your professional growth!
Apply now: send us your CV to
Questions? Connect with me: Hernán Vietto
This is an excellent opportunity for anyone looking to take on a dynamic environment and contribute to projects that make a difference in people's lives. If you are passionate about technology and health, join Darwoft!
Apply now and join Darwoft!
Email:
Questions? Follow the recruiter: Hernán Vietto