Activities
Data analyst for the National Buoy Program (PNBOIA). Currently working on:
- database maintenance and modelling with PostgreSQL;
- developing quality-control procedures for data from the many sensors installed on operational platforms such as metocean buoys and autonomous underwater vehicles (AUVs), as illustrated in the sketch after this list;
- validating the availability of these data to clients through APIs.
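To illustrate the kind of quality control mentioned above, here is a minimal, hypothetical sketch of a range check on buoy sea-surface temperature readings using Pandas; the column names, values, and limits are illustrative assumptions, not PNBOIA's actual procedure.

```python
import pandas as pd

# Illustrative readings: 'sst' in degrees Celsius.
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2024-01-01 00:00", "2024-01-01 01:00", "2024-01-01 02:00"]
    ),
    "sst": [24.1, 98.7, 23.9],  # 98.7 is an obvious sensor spike
})

# Flag values outside an assumed plausible physical range (-5 to 40 degC).
readings["sst_flag"] = (
    readings["sst"].between(-5, 40).map({True: "good", False: "suspect"})
)

print(readings)
```

In practice such checks would be one step in a larger pipeline (gap checks, spike tests, cross-sensor comparisons) before the data are exposed to clients through the APIs.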
Skills
Data Analyst with a background in Oceanography.
- Academic background with a focus on the use of remote sensing to study Geophysical Fluid Dynamics.
- Exchange period at the University of Victoria as a CNPq scholarship holder through the Science Without Borders program.
- Experience with metocean data (altimetry, SST, salinity, ocean colour, wind) and oceanographic instrumentation (ice-profiling sonars, acoustic zooplankton and fish profilers, CTDs, etc.), as well as with numerical ocean circulation models (ROMS, HYCOM).
Started focusing on data analysis after completing Ironhack's intensive Data Analytics bootcamp. Proficient in SQL (MySQL/PostgreSQL), ETL and data pipelines, data visualization with Power BI and some Tableau, APIs for data handling (and some web scraping/data mining), and machine learning. My Python toolkit includes libraries such as Pandas, Matplotlib, Seaborn, NumPy, scikit-learn, SciPy, netCDF4, xarray, BeautifulSoup, Selenium, Dask, and SQLAlchemy. The latest tools I have learned are focused on data engineering: Docker, Kubernetes, Spark, and Apache Airflow.
Comment(s)
Got acquainted with OceanExpert through the Ocean Best Practices Self-Paced Course, which introduces students to the concept of ocean best practices, the OBPS programme, and its Repository.