Neptune for MLOps: Better Data Science Tools

By:
Gaja Klaudel
February 3, 2022

Neptune is an online tool for monitoring and storing your data science experiments in an orderly manner. It makes monitoring and controlling your MLOps easier, so you can be more productive in ML engineering and research. It's especially useful when you work on several projects concurrently and test different model settings. Neptune logs diverse types of data about your models' training and visualizes experiment metadata as soon as training starts. With Neptune, you can organize results according to your specific needs and compare multiple runs.

Table of contents: <ul><li><a href="#setup">How to set up Neptune for MLOps</a></li><li><a href="#logging">What can you log in Neptune</a></li><li><a href="#runs">How to distinguish and compare runs</a></li><li><a href="#experience">My experience with Neptune for MLOps - pros and cons</a></li><li><a href="#conclusion">Summary</a></li></ul>

<hr />

<h2 id="setup"><b>How to set up Neptune for MLOps</b></h2>

You can set up <a href="https://neptune.ai/" target="_blank" rel="noopener noreferrer">Neptune</a> quickly in six short steps:

<ol><li>Register &amp; log in</li><li>Set up your project</li><li>Install Python 3.x</li><li>Install Neptune: pip install neptune-client</li><li>Include other dependencies (e.g., for fastai: pip install neptune-fastai)</li><li>Connect the script to the Neptune logging board (a minimal sketch follows in the next section)</li></ol>

<blockquote><strong>Access the largest Computer Vision models library with <a href="https://appsilon.com/timm-with-fastai/" target="_blank" rel="noopener noreferrer">PyTorch Image Model with fastai</a>.</strong></blockquote>

Note: you don't have to pass the api_token option if you've already added it to your .bashrc file as the NEPTUNE_API_TOKEN variable.

<img class="alignnone size-full wp-image-12068 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01efb487b2c34e796afed_Neptune-API-Token-example.webp" alt="" width="390" height="76" />

<h2 id="logging"><b>What can you log in Neptune</b></h2>

After a proper setup, you can begin logging project data. In the Appsilon Computer Vision and ML team, we use Neptune when we want to monitor all of our models' variations, their results, and the code version tied to each run. It helps our team maintain our models in production reliably and efficiently. But how do we log such data?

<blockquote><strong>Keeping an eye out for Object Detection algorithms? See our <a href="https://appsilon.com/object-detection-yolo-algorithm/" target="_blank" rel="noopener noreferrer">introduction to YOLO Object Detection</a>.</strong></blockquote>

Once your script is connected to Neptune, you only need a simple one-liner for every type of data.
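For example, connecting a script (step 6 above) and logging a value looks roughly like the sketch below. This is a minimal sketch assuming the neptune.new API of neptune-client, which was current at the time of writing; the project name and token are placeholders.

<pre><code>import neptune.new as neptune

# Connect the script to Neptune (step 6). You can omit api_token here
# if NEPTUNE_API_TOKEN is already exported in your .bashrc.
run = neptune.init(
    project="my-workspace/my-project",  # placeholder workspace/project name
    api_token="YOUR_API_TOKEN",
)

# A simple one-liner: store a single value under a custom namespace
run["parameters/batch_size"] = 64
</code></pre>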
<h3><b>Metrics / losses</b></h3>

<script src="https://gist.github.com/MicahAppsilon/97c4be1c65e436c37ad2a7f6a5fc1858.js"></script>

<img class="alignnone size-full wp-image-12066 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01efcfeabe8f16c328a9b_Metrics-log.webp" alt="" width="1600" height="530" />

<h3><b>Events files / model checkpoints - created during training</b></h3>

<script src="https://gist.github.com/MicahAppsilon/0bfb80c6a3d2556e773ecc3db0cb2e19.js"></script>

<img class="alignnone size-full wp-image-12064 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01efda8912184dd39e52a_Metadata-events-files-and-checkpoints.webp" alt="" width="855" height="536" />

<h3><b>Images &amp; other files</b></h3>

You can log various model-building metadata types with Neptune. Here's a list of a few supported file types:

<ul><li>Standard image formats - png, jpg, gif</li><li>Matplotlib figures</li><li>PIL images</li><li>NumPy arrays</li><li>TensorFlow tensors</li><li>PyTorch tensors</li></ul>

<script src="https://gist.github.com/MicahAppsilon/2f5f06b6c27d41c5ad781f42211fb3d3.js"></script>

Note: the difference between the log and upload methods is that upload stores a single file (e.g., one image), while log appends values to a series (e.g., a series of images).

<img class="alignnone size-full wp-image-12062 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01efeb67e7dee63781458_Confusion-matrix.webp" alt="" width="1600" height="829" />

<h3><b>Files, scripts</b></h3>

<script src="https://gist.github.com/MicahAppsilon/f7acb54dabffec206d72493ed654b560.js"></script>

Neptune also logs some basic info about every run automatically:

<ul><li>System information (creation time, hostname, id, owner, running time, etc.)</li><li>Hardware consumption and console logs (CPU/GPU monitoring, error and output logs from your console)</li><li>Git information tied to your script</li></ul>

Besides that, you can also log:

<ul><li>Model hyperparameters / configurations</li><li>Notebook code snapshots - every time you run your code in a Jupyter Notebook!</li><li>Data versions</li><li>DVC files</li><li>Python logger logs</li><li>Interactive visualizations (HTML files, bokeh plots, plotly plots)</li><li>Descriptions of a given run</li><li>Tags tied to a given run</li><li>Video files</li><li>Audio files</li></ul>
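Putting the pieces together, the one-liners from this section look roughly like the sketch below. This is a hedged example based on the neptune.new API; the metric values, checkpoint path, and figure are illustrative placeholders.

<pre><code>import matplotlib.pyplot as plt
import neptune.new as neptune
from neptune.new.types import File

run = neptune.init(project="my-workspace/my-project")  # placeholder name

# Metrics / losses: log() appends to a series, one call per step or epoch
for loss in [0.9, 0.6, 0.4]:
    run["train/loss"].log(loss)

# Files / model checkpoints: upload() stores a single file
run["model/checkpoint"].upload("checkpoints/model_final.pt")  # placeholder path

# Images: wrap a Matplotlib figure (or PIL image, NumPy array, ...) in File
fig, ax = plt.subplots()
ax.plot([0.9, 0.6, 0.4])
run["eval/loss_curve"].upload(File.as_image(fig))

run.stop()  # flush and close the run
</code></pre>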
<img class="alignnone size-full wp-image-12072 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01f0046af19f939a8225d_runs-table.webp" alt="" width="1600" height="540" /> &nbsp; <h3><b>Compare runs tab</b></h3> Choose runs that you want to compare by clicking on the eye icon in the runs table. In doing so, you can see the comparison results in the compare runs tab. The tab is located on the left part of the screen. Here you can see charts, parallel coordinates, and side-by-side parameter comparisons. Besides that, Neptune is creating some auto-generated comparison dashboards for your runs. Neptune is a very customizable tool, so you can also create your own dashboards and save them for later! <img class="size-full wp-image-12060 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01f01ee976c4ea991705a_compare-runs-tab.webp" alt="" width="1600" height="715" /> &nbsp; <h3><b>Horizontal split tab of runs table and runs comparisons</b></h3> This is the type of view, where your screen is split between the runs table and comparisons data. It's convenient if you need to view them at the same time. <img class="alignnone size-full wp-image-12074 aligncenter" src="https://webflow-prod-assets.s3.amazonaws.com/6525256482c9e9a06c7a9d3c%2F65b01f0304abfeee1f057021_split-runs-table-and-comparison.webp" alt="" width="1600" height="780" /> &nbsp; <h2 id="experience"><b>My experience with Neptune for MLOps - pros &amp; cons</b></h2> I have to admit that at first glance, I wasn't a fan of the Neptune tool. I was using Tensorboard for a very long time and I couldn't imagine using another tool for experiment tracking purposes. But the convenience of Neptune changed my opinion on this. I was able to set up and start logging data in just a few minutes. This without knowing the tool, and reading only a couple of lines from clean, concise documentation. <blockquote><strong>See how Appsilon uses <a href="https://appsilon.com/gabon-wildlife-ai-for-biodiversity-conservation/" target="_blank" rel="noopener noreferrer">AI for biodiversity conservation</a>. </strong></blockquote> <h3>Neptune vs Tensorboard for MLOps</h3> Moreover, Neptune is a little bit different from the popular Tensorboard. Its application is not only for tracking and comparing experiments during training but also for storing every piece of info and data concerning a given run, which might be useful and crucial in the future. Neptune is also very customizable and the user gets to decide how to use its options and capabilities. For me, the biggest advantage of Neptune is that it's not dependent on some other libraries and modules. It's a self-standing, easy-to-setup tool. 
<h2 id="experience"><b>My experience with Neptune for MLOps - pros &amp; cons</b></h2>

I have to admit that at first glance, I wasn't a fan of Neptune. I had been using TensorBoard for a very long time and couldn't imagine using another tool for experiment tracking. But Neptune's convenience changed my mind: I was able to set up the tool and start logging data in just a few minutes, without knowing it beforehand and after reading only a couple of lines of its clean, concise documentation.

<blockquote><strong>See how Appsilon uses <a href="https://appsilon.com/gabon-wildlife-ai-for-biodiversity-conservation/" target="_blank" rel="noopener noreferrer">AI for biodiversity conservation</a>.</strong></blockquote>

<h3>Neptune vs TensorBoard for MLOps</h3>

Neptune also differs a bit from the popular TensorBoard. It's meant not only for tracking and comparing experiments during training but also for storing every piece of information about a given run that might prove useful or even crucial in the future. Neptune is also very customizable, and the user decides how to use its options and capabilities. For me, the biggest advantage of Neptune is that it doesn't depend on other libraries and modules - it's a self-standing, easy-to-set-up tool.

<h3><b>Pros</b></h3>

<ul><li>Easy to set up and start using compared with, for example, TensorBoard</li><li>Clear, concise, well-written documentation</li><li>Offline version</li><li>Customizable comparisons and runs table</li><li>Search and filtering options</li><li>Lets you keep a “backup” of all experiment data in one place, so you can restore it one day</li><li>Trash option, from which deleted runs can easily be restored to the runs table</li><li>Fast communication and support from the Neptune team</li><li>Possibility to track whole Jupyter notebooks</li><li>Independent of any other modules</li><li>Easy to share with someone outside the project via a link</li><li>Not only for Python but also for R users</li><li>Integration options, for example with fastai - but that's material for another blog post</li><li>You can watch your training even when you're not next to the computer - for example, from the browser on your mobile phone ;)</li></ul>

<h3><b>Cons</b></h3>

<ul><li>The user needs to synchronize the offline and online versions manually.</li><li>Not fully open source; the individual plan would probably be enough for private use, but it has per-month limits.</li><li>Run comparisons could be more intuitive (I had to check the documentation to learn exactly how to do them).</li><li>Some minor design issues, e.g., when comparing runs, the on-hover display of run IDs fails when the IDs are too long.</li></ul>

<h2 id="conclusion"><b>Summary</b></h2>

Neptune is an excellent data science tool, and it's easier to use for experiment tracking and storage than similar tools like TensorBoard. I recommend testing it out and seeing how you can improve your projects with Neptune for MLOps!
