6 Database Developer Jobs in Argentina

Database Developer

Bridgenext

Today

Job Description

Company Overview

Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Our global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services, while elevating brands through digital experience, creative content, and customer data analytics services.

Don't just work, thrive. At Bridgenext, you have an opportunity to make a real difference - driving tangible business value for clients, while simultaneously propelling your own career growth. Our flexible and inclusive work culture provides you with the autonomy, resources, and opportunities to succeed.

Position Description

We are looking for a talented and experienced Database Development Engineer to join our team in Latin America. You will be expected to work autonomously and to contribute to several high-impact projects, including architecting the data pipeline infrastructure that supports our customers' data needs.

You will be part of a team with on-call duties outside CST business hours and will be expected to respond to high-priority incident pages.

Responsibilities:

  • Provide thought leadership on governance and best practices for database design, security, and usage (MS SQL Server and PostgreSQL) across the enterprise application, with the ability to communicate and influence throughout the enterprise
  • Mentor the data engineering team on maintenance, administration, and security
  • Mentor and lead development and application engineering teams on MS SQL Server and PostgreSQL optimization and tuning
  • Guide and work on architecture and partitioning decisions to balance scalability and consistency
  • Optimize performance and cost through indexing, query tuning, and debugging stored procedures
  • Balance cloud costs against performance for MS SQL Server and PostgreSQL databases
  • Automate monitoring, alerting, and backups for availability, performance, and compliance (see the sketch after this list)
  • Exposure to Azure and NoSQL is a plus
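
For illustration, here is a minimal Python sketch of the kind of monitoring automation described above, assuming a PostgreSQL instance reachable with psycopg2. The connection details, the 5-minute threshold, and the plain-print alert are hypothetical placeholders, not part of the posting.

    # Flag long-running queries on a (hypothetical) PostgreSQL instance.
    import psycopg2

    ALERT_THRESHOLD = "5 minutes"  # hypothetical alerting threshold

    def long_running_queries(conn):
        """Return active queries that have exceeded the alert threshold."""
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT pid, now() - query_start AS duration, query
                FROM pg_stat_activity
                WHERE state = 'active'
                  AND now() - query_start > interval %s
                """,
                (ALERT_THRESHOLD,),
            )
            return cur.fetchall()

    if __name__ == "__main__":
        # Placeholder connection settings; in practice these come from a secrets store.
        conn = psycopg2.connect(host="db.example.internal", dbname="appdb",
                                user="monitor", password="***")
        for pid, duration, query in long_running_queries(conn):
            print(f"ALERT pid={pid} running for {duration}: {query[:80]}")
        conn.close()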

Workplace: Full-time hours. Remote, working from home on CST hours, and based in Latin America.

Must Have Skills:

  • 5+ years of experience designing data pipelines using dbt (Data Build Tool) or Azure Data Factory
  • 5+ years of hands-on experience in data and analytics
  • 3+ years of proficiency with MS SQL Server and PostgreSQL
  • 1+ year of experience working with PowerShell
  • Solid understanding of database best practices
  • Strong verbal and written communication skills
  • Ability to work independently and collaboratively in a fast-paced environment

Preferred Skills:

  • Experience with additional data processing tools and frameworks
  • Familiarity with cloud-based data solutions
  • Exposure to data security and governance best practices
  • Experience with distributed data processing systems
  • Strong understanding of performance optimization techniques for data queries

Professional Skills:

  • Solid written, verbal, and presentation communication skills
  • Strong team player who also works well individually
  • Maintains composure in all types of situations and is collaborative by nature
  • High standards of professionalism, consistently producing high-quality results
  • Self-sufficient and independent, requiring very little supervision or intervention
  • Demonstrates flexibility and openness, bringing creative solutions to address issues

Bridgenext is an Equal Opportunity Employer

Data Modeling Junior Manager – Marketing Effectiveness

Buenos Aires – NielsenIQ

Posted 6 days ago

Job Description

The Modelling Manager will be involved in projects from a modelling perspective.

The responsibilities of this position start with understanding project briefs from internal stakeholders, identifying the right analytic solutions from the product portfolio to answer client questions, guiding team members to execute them error-free, and making sure results are delivered on time.

The work is heavy on data analysis and statistical modelling, and ultimately on presenting results from a business, non-technical point of view. The expectation is to carry out analyses and lead projects and modelling independently, with minimal support.

Responsibilities

  • Execute analyses error-free and on time, and ensure the same for the team they lead
  • Manage execution of multiple analyses within a project and build efficiencies by identifying faster and simpler ways to improve existing solutions
  • Lead discussions with internal stakeholders and effectively tackle all challenges in the assigned project
  • Carry out multiple standard and non-standard analyses to help build quality, insightful proposals and final insights presentations
  • Gain a detailed understanding of, and develop expertise in, existing analytical solutions
  • Based on experience in the team, assess final model results, storyboard the results for clients, and guide other team members in model finalization
  • Continuously explore and experiment with different techniques and methods to improve existing solutions
  • Help in R&D studies by carrying out analyses as planned

Qualifications

  • Excellent, clear, and confident communication
  • 5-7+ years of experience in a data analysis and modelling role
  • Awareness of the market research and FMCG/T&D industries is preferred
  • Experience with Market Mix Modelling (MMM) is preferred (see the sketch after this list)
  • Able to work collaboratively in a team and guide junior associates
  • Thorough understanding and working knowledge of statistical techniques such as regression, cluster and factor analyses, forecasting, and significance testing
  • Hands-on with data processing and programming tools such as R and Python
  • Hands-on data processing in Python is preferred
  • Proficiency in Microsoft Excel for data analysis and reporting
  • Strong analytical and problem-solving skills with attention to detail
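
For illustration, here is a minimal Python sketch of the kind of regression-based market-mix analysis referenced above, assuming pandas and statsmodels. The CSV path, media channels, and column names are hypothetical placeholders.

    # Regress (log) sales on (log) media spend and price to approximate elasticities.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("weekly_sales.csv")  # hypothetical weekly data per market

    # Log-transform drivers so coefficients read as approximate elasticities.
    X = sm.add_constant(np.log1p(df[["tv_spend", "digital_spend", "price"]]))
    y = np.log(df["sales"])

    model = sm.OLS(y, X).fit()
    print(model.summary())           # significance testing of each driver
    print(model.params["tv_spend"])  # approximate TV-spend elasticity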

Additional Information

Our Benefits

  • Flexible working environment
  • Volunteer time off
  • LinkedIn Learning
  • Employee Assistance Program (EAP)

About NIQ

NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population.

For more information, visit NIQ.com

Want to keep up with our latest updates?

Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion

NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center.


Data Engineer - Data Pipelines & Modeling

Buenos Aires, Buenos Aires – Ryz Labs

Today

Job Description

This position is only for professionals based in Argentina or Uruguay

We're looking for a data engineer for one of our clients' teams. You will help enhance and scale the data transformation and modeling layer. This role will focus on building robust, maintainable pipelines using dbt, Snowflake, and Airflow to support analytics and downstream applications. You’ll work closely with the data, analytics, and software engineering teams to create scalable data models, improve pipeline orchestration, and ensure trusted, high-quality data delivery.

Key Responsibilities:

- Design, implement, and optimize data pipelines that extract, transform, and load data into Snowflake from multiple sources using Airflow and AWS services

- Build modular, well-documented dbt models with strong test coverage to serve business reporting, lifecycle marketing, and experimentation use cases

- Partner with analytics and business stakeholders to define source-to-target transformations and implement them in dbt

- Maintain and improve our orchestration layer (Airflow/Astronomer) to ensure reliability, visibility, and efficient dependency management (see the sketch after this list)

- Collaborate on data model design best practices, including dimensional modeling, naming conventions, and versioning strategies
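
For illustration, here is a minimal Python sketch of an orchestrated pipeline step of the kind described above, assuming Airflow 2.4+ and the BashOperator. The DAG name, schedule, retry settings, and dbt command are hypothetical placeholders.

    # A daily DAG that runs a (hypothetical) dbt build with retry logic.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "owner": "data-eng",
        "retries": 3,                          # retry transient failures
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="dbt_daily_build",              # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        dbt_build = BashOperator(
            task_id="dbt_build",
            bash_command="dbt build --target prod",  # hypothetical dbt invocation
        )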

Core Skills & Experience:

- dbt: Hands-on experience developing dbt models at scale, including use of macros, snapshots, testing frameworks, and documentation. Familiarity with dbt Cloud or CLI workflows

- Snowflake: Strong SQL skills and understanding of Snowflake architecture, including query performance tuning, cost optimization, and use of semi-structured data (see the sketch after this list)

- Airflow: Solid experience managing Airflow DAGs, scheduling jobs, and implementing retry logic and failure handling; familiarity with Astronomer is a plus

- Data Modeling: Proficient in dimensional modeling and building reusable data marts that support analytics and operational use cases

- AWS (Nice to Have): Familiarity with AWS services such as DMS, Kinesis, and Firehose for ingesting and transforming data

- Segment (Nice to Have): Familiarity with event data and related flows, piping data in and out of Segment
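
For illustration, here is a minimal Python sketch of querying semi-structured data in Snowflake with the official connector (snowflake-connector-python). All connection details, table, and column names are hypothetical placeholders.

    # Count event types stored in a (hypothetical) VARIANT column of JSON events.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",        # placeholder account identifier
        user="ANALYTICS_SVC",
        password="***",
        warehouse="ANALYTICS_WH",
        database="RAW",
        schema="EVENTS",
    )

    try:
        cur = conn.cursor()
        # v:path::type is Snowflake's semi-structured access and casting syntax.
        cur.execute(
            """
            SELECT v:event_type::string AS event_type, COUNT(*) AS n
            FROM web_events
            GROUP BY 1
            ORDER BY n DESC
            LIMIT 10
            """
        )
        for event_type, n in cur.fetchall():
            print(event_type, n)
    finally:
        conn.close()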


731 - Data Scientist SSr/Sr (AI-ML/Modeling in Retail/CPG) · LATAM

Cordoba – Darwoft

Posted 6 days ago

Job Description

Senior Data Scientist, AI/ML Modeling in Retail/CPG

Remote

  • International projects
  • Darwoft

Are you a Senior Data Scientist with experience in Python, predictive models, and a focus on retail/CPG? At Darwoft, we are looking for 4 exceptional talents for innovative projects, developing solutions that make a difference.

What will you do?
  • Data exploration and analysis: Process and analyze structured and unstructured data to uncover trends and generate strategic insights.
  • Model development: Design, build, and evaluate predictive models and machine learning algorithms, especially around price elasticity and forecasting.
  • Prototypes and solutions: Build rapid prototypes (POCs and MVPs) that solve real business problems and turn into actionable solutions.
  • Visualization and communication: Present complex insights through clear visualizations, with data-driven recommendations for technical and non-technical stakeholders.
  • Collaboration: Work with engineering and development teams to ensure models are scalable and reliable in production.
What are we looking for?
  • Education: University degree or higher in Data Science, Statistics, Computer Science, or a related field.
  • Experience: At least 2 years in retail/CPG, working with pricing models and demand forecasting.
  • Key technical skills: Advanced Python programming (pandas, object-oriented programming) and SQL; time-series analysis, optimization and forecasting models; prototyping, QA, and optimization of models for production; interpretation of complex data and communication of insights (see the sketch below).

Preferred skills: Familiarity with ML tools such as PyTorch, TensorFlow, scikit-learn, Jupyter Notebooks, and NumPy; experience processing big data with Spark, PySpark, or Azure Databricks; building interfaces with Flask, Plotly, or Streamlit; knowledge of continuous integration, testing, deployment, and release methodologies.
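
For illustration, here is a minimal Python sketch of the kind of price-elasticity modelling the role describes, assuming pandas and scikit-learn. The file name and columns (price, units) are hypothetical placeholders.

    # Fit a log-log regression of units sold on price; the slope approximates elasticity.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    df = pd.read_csv("sku_weekly_sales.csv")   # hypothetical columns: price, units

    X = np.log(df[["price"]].to_numpy())
    y = np.log(df["units"].to_numpy())

    model = LinearRegression().fit(X, y)
    elasticity = model.coef_[0]                # d(log units) / d(log price)
    print(f"Estimated price elasticity: {elasticity:.2f}")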

Why join Darwoft?

At Darwoft, we share a passion for innovation and impact. You will have the opportunity to work on challenging international projects, collaborating with highly skilled colleagues. We focus on your professional growth!

Apply now

Send us your CV to

Questions?

Connect with me: Hernán Vietto

This is an excellent opportunity for anyone looking to challenge themselves in a dynamic environment and contribute to projects that make a difference in people's lives. If you are passionate about technology and health, join Darwoft!

Apply now and join Darwoft!

Email:

Questions? Follow the recruiter: Hernán Vietto
