Data Analytics


FarmBot: Robotics, Analytics, and Sustainability

Over the past year, we have been presented with a unique set of challenges. Living and working from home has been difficult for us all, and it has been especially disruptive to research projects. However, this was the perfect opportunity to test a machine built for exactly such a scenario. FarmBot gives us a sneak peek into the future of farming: fully automated and sustainable. These are imperative steps toward increasing the availability of fresh produce and cutting the plastic packaging, pesticide use, and carbon emissions that continue to pollute the earth and the food we eat and to drive climate change. Even though setting up FarmBot proved an arduous task, the final result is a sustainable farming method that allows for automated data collection and plant maintenance.

FarmBot is an autonomous, open-source computer numerical control (CNC) farming robot that prioritizes sustainability and optimizes modern farming techniques. Using CNC, FarmBot can accurately and repeatedly conduct experiments with no human input and therefore very little error. We can write sequences and plan regimens and events to collect data 24/7, in addition to monitoring the system remotely. This allows us to plan as many plants, crops, inputs, and operations as needed. Reducing cost and increasing the sustainability of farming are priorities of the FarmBot project.

The FarmBot Genesis features a gantry mounted on tracks attached to the sides of a raised garden bed. The tracks provide a high level of precision: the garden bed is represented as a grid in which plant locations and tools have specific coordinates, allowing endless customization of your garden. The gantry bridges both sides of the track and uses a belt-and-pulley drive system and V-wheels to move along the X-axis (the tracks) and the Y-axis (the gantry). The cross-slide that controls the Y-axis also uses a leadscrew to move the Z-axis extrusion, allowing for up-and-down movement as well.
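To make the coordinate model concrete, here is a minimal Python sketch of how plants on the bed grid might be represented and visited. The plant names, coordinates, and the move_absolute helper are hypothetical placeholders for illustration, not FarmBot’s actual API or our real layout.

# Minimal sketch of the bed-as-grid idea (all names and numbers are
# hypothetical, not FarmBot's actual API or our real layout).
# Each plant gets an (x, y, z) position in millimeters on the bed grid.
PLANTS = {
    "lettuce_1": (200, 300, 0),
    "lettuce_2": (200, 600, 0),
    "radish_1": (800, 300, 0),
}

SAFE_Z = -100  # hypothetical travel height above the soil, in mm


def move_absolute(x: int, y: int, z: int) -> None:
    """Placeholder for a real motion command sent to the machine."""
    print(f"Moving gantry to X={x} Y={y} Z={z}")


def visit_all_plants() -> None:
    """Drive the tool head over each plant at a safe travel height."""
    for name, (x, y, _) in sorted(PLANTS.items()):
        print(f"Visiting {name}")
        move_absolute(x, y, SAFE_Z)


if __name__ == "__main__":
    visit_all_plants()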

Movement is powered by four NEMA 17 stepper motors with rotary encoders to monitor relative motion. The encoders are read by a dedicated processor on the custom Farmduino electronics board, which pairs with a Raspberry Pi 3 that acts as the “brain”. The Farmduino v1.5 system has several useful features, including built-in stall detection for the motors. Stainless steel and aluminum hardware makes the machine resistant to corrosion and safe for long-term outdoor use.

The FarmBot Genesis model also includes several tools made of UV-resistant ABS plastic that are interchanged using the Universal Tool Mount (UTM) on the Z-axis extrusion. The UTM uses 12 electrical connections, three liquid/gas lines, and magnetic coupling to mount and change tools with ease, allowing for automation at nearly every step of the planting and growing process. Sowing seeds without any human intervention is possible with the vacuum-powered seed injector tool, which is compatible with three different-sized needles to accommodate seeds of varying sizes. Once planting has been completed, regimens can be set up to water plants on a schedule using the watering nozzle. The attachment is coupled with a solenoid valve to control the flow of water, ensuring that each plant receives as much moisture as it needs. The soil sensor tool takes the automation of watering a step further by detecting the moisture of the soil and using the collected data to modify the amount of water dispensed as needed. A customizable weeding tool uses spikes to push young weeds into the soil before they become an issue. FarmBot uses a built-in waterproof camera to detect weeds and take photos of plants to track growth. All of these tools and features create a completely customizable farming experience without the worry of human error.
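As a rough illustration of that soil-sensor feedback loop, the sketch below maps a moisture reading to a watering duration. The calibration numbers and the read_moisture and open_valve helpers are hypothetical placeholders; the real behavior is configured through FarmBot’s sequences rather than standalone code like this.

# Hypothetical calibration values: raw analog soil-sensor readings
# (higher = wetter). These are illustrative, not real calibration data.
DRY_READING = 300
WET_READING = 700
MAX_WATER_SECONDS = 10


def read_moisture() -> int:
    """Placeholder for reading the soil sensor through the machine's electronics."""
    return 420  # pretend reading


def open_valve(seconds: float) -> None:
    """Placeholder for energizing the solenoid valve for `seconds`."""
    print(f"Opening solenoid valve for {seconds:.1f}s")


def water_plant() -> None:
    """Water longer when the soil reads drier; skip watering when it is wet."""
    reading = read_moisture()
    # Normalize the reading to 0.0 (dry) .. 1.0 (wet), clamped to range.
    wetness = max(0.0, min(1.0, (reading - DRY_READING) / (WET_READING - DRY_READING)))
    seconds = (1.0 - wetness) * MAX_WATER_SECONDS
    if seconds < 1.0:
        print("Soil is wet enough; skipping watering")
        return
    open_valve(seconds)


if __name__ == "__main__":
    water_plant()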

FarmBot operates using a 100% open-source operating system and web app. In the web app, we can easily control and configure the machine and create customizable sequences and routines, all with a drag-and-drop farm designer and a block-code format. From there we can see all information regarding the positioning, tools, and plants within the garden bed. FarmBot’s Raspberry Pi runs FarmBot OS to communicate and stay synchronized with the web app, allowing it to download and execute scheduled events, be controlled in real time, and upload logs and sensor data.
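For a sense of what staying synchronized with the web app makes possible, here is a hedged Python sketch that pulls plant data from the FarmBot web app’s REST API. The base URL, endpoint paths, and response fields reflect our reading of FarmBot’s public developer documentation and may change; the credentials are placeholders.

import requests

# Rough sketch of pulling data out of the FarmBot web app's REST API.
# The base URL and endpoint paths below reflect our understanding of the
# public API and may change; treat them as assumptions, not a reference.
BASE = "https://my.farm.bot/api"
EMAIL = "you@example.com"     # hypothetical credentials
PASSWORD = "your-password"


def get_token() -> str:
    """Request an API token with the same credentials used for the web app."""
    resp = requests.post(f"{BASE}/tokens",
                         json={"user": {"email": EMAIL, "password": PASSWORD}})
    resp.raise_for_status()
    return resp.json()["token"]["encoded"]


def list_plants(token: str) -> list:
    """Fetch every point the web app knows about and keep only the plants."""
    resp = requests.get(f"{BASE}/points",
                        headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return [p for p in resp.json() if p.get("pointer_type") == "Plant"]


if __name__ == "__main__":
    token = get_token()
    for plant in list_plants(token):
        print(plant["name"], plant["x"], plant["y"])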

We faced many ups and downs during the hardware and software phases of FarmBot. Between vague reference docs, manufacturing defects, hardware failures, and network security problems, this was an in-depth and at times very frustrating project. However, we didn’t want an easy project. The challenges we faced when building FarmBot, as annoying as they were to debug at the time, helped us gain a deep understanding of how this machine works. Through all the blood, sweat, and tears (literally), we learned more from this project than we ever could have imagined. From woodworking to circuitry to programming and botany, we tackled a wide range of issues. But that’s what made this project so worth it. Many times we had to resort to out-of-the-box thinking to resolve issues with some of the limited components we had. Working as a team also allowed us to bounce ideas off one another while each of us brought our own unique talents to the group. Nicolas has a background in Computer Science, which helped with programming FarmBot as well as resolving the software and networking issues that came up. Hannah’s major is Biological Sciences, and her experience with gardening helped with the plant science aspects of the project on top of overall planning and construction. As an electrical engineering major, Nicholas helped with the building process and wiring, and his input was critical in resolving issues with such an advanced electromechanical system. While we recognize FarmBot’s shortcomings, it was an incredible learning experience and an imperative building block toward a more sustainable and eco-friendly future.

Moving forward, we plan on modifying FarmBot to include a webcam, a rain barrel, and solar panels. A webcam will give us a live feed of FarmBot for remote monitoring and let us capture photos for time-lapse photography. Rain barrels can collect rainwater to be recycled as FarmBot’s water source, further increasing its sustainability. Solar panels will provide a dedicated, off-grid energy system, helping to further reduce the cost and carbon emissions associated with running FarmBot. We also currently have an MIS capstone team developing ways to pull real-time data for future analysis and for dataset use in other academic settings.

Be on the lookout for future FarmBot updates, and feel free to reach out to opiminnovate@uconn.edu for more information.

By: Nicholas Satta, Hannah Meikle, Nicolas Michel

 



Social Media Integration and Interaction Analysis Using Natural Language Processing

The goal of our project was to implement a scoring system for Convention Nation, a company that recommends conventions to its users. Scores would be assigned based on the level of engagement with Convention Nation’s social media presence. This project combined the principles of gamification with a range of technologies offered at OPIM Innovate: we incorporated IoT in the form of a device that displays Facebook data, AI in the form of sentiment analysis, and AR in the form of Splunk’s AR workspaces.

We were able to pull data from Facebook using the Graph Application Programming Interface (API). We then used the Natural Language Toolkit (NLTK), a Python platform for working with natural language data, to perform sentiment analysis on Facebook comments. For our presentation, we created a Facebook page that visitors could interact with and assigned scores to the administrators of the page. As each page admin made posts and visitors commented on them, scores changed based on the level of constructive engagement. Splunk, a data analytics platform, provided a series of tools that could be used to view trends in the data and display it in augmented reality. I designed laser-cut QR-code lapel pins using the laser cutter at the Maker Studio in the library. The Splunk AR workspaces allowed us to scan the pins with an iPad and see our engagement scores appear next to us.
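As an illustration of the sentiment-analysis step, the sketch below scores a few made-up comments with NLTK’s VADER analyzer and rolls them into a single engagement score. The comments, weighting, and scoring formula are invented for this example and are not Convention Nation’s actual rules.

# Minimal sketch of the comment-scoring idea using NLTK's VADER analyzer.
# The comments, weights, and scoring formula here are made up for
# illustration; they are not Convention Nation's actual scoring rules.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

comments = [
    "This convention lineup looks amazing, can't wait!",
    "The registration page keeps crashing, very frustrating.",
    "Great panel suggestions, thanks for sharing.",
]

analyzer = SentimentIntensityAnalyzer()

score = 0.0
for text in comments:
    compound = analyzer.polarity_scores(text)["compound"]  # -1.0 .. 1.0
    # Reward constructive (positive) engagement more than negative comments
    # penalize it -- an arbitrary choice for this sketch.
    score += 2.0 * compound if compound > 0 else compound

print(f"Engagement score for this page: {score:.2f}")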

By: Eli Udler

 


An Introduction to Industrial IoT

This past semester, I participated in OPIM 4895: An Introduction to Industrial IoT, a course that brings data analytics and Splunk technology to the University’s Spring Valley student farm. In this course, I learned how to deploy sensors and data analytics to monitor real-time conditions in the greenhouse in order to practice sustainable farming and aquaponics. The sensors tracked pH, oxygen levels, water temperature, and air temperature, and the data was then analyzed in Splunk. At the greenhouse, we were able to visualize this data in real time using an augmented reality system with QR codes powered by Splunk. We were also able to monitor the data remotely through Apple TV dashboards in the OPIM Innovate Lab on campus.
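To give a feel for how readings like these end up in Splunk, here is a hedged Python sketch that posts one batch of greenhouse measurements to Splunk’s HTTP Event Collector (HEC). The host, token, sourcetype, and field names are placeholders and not necessarily what the course deployment used.

import json
import time
import requests

# Sketch of forwarding greenhouse sensor readings to Splunk's HTTP Event
# Collector (HEC). The host, token, sourcetype, and field names are
# placeholders, not the actual course deployment.
HEC_URL = "https://splunk.example.edu:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # hypothetical token


def send_reading(ph: float, dissolved_oxygen: float,
                 water_temp_f: float, air_temp_f: float) -> None:
    """Post one batch of greenhouse readings as a single HEC event."""
    event = {
        "time": time.time(),
        "sourcetype": "greenhouse:sensors",  # assumed sourcetype name
        "event": {
            "ph": ph,
            "dissolved_oxygen_mg_l": dissolved_oxygen,
            "water_temp_f": water_temp_f,
            "air_temp_f": air_temp_f,
        },
    }
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps(event),
        verify=False,  # only acceptable for a self-signed lab certificate
    )
    resp.raise_for_status()


if __name__ == "__main__":
    send_reading(ph=6.8, dissolved_oxygen=7.2, water_temp_f=68.0, air_temp_f=74.5)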

As a senior graduating this upcoming December, I appreciated the opportunity to get hands-on experience with emerging technology. Learning tangible skills is critical for students who plan on entering the workforce, especially in the world of technology. The Industrial IoT course has been one of my favorite courses of my undergraduate career at UConn. I believe this is largely because it significantly strengthened my technical skills through interactive learning, working closely with other students and faculty, and traveling on-site to the greenhouse. Using Splunk to analyze data produced by sensors we deployed ourselves at the farm is a perfect example of experiential learning.

By: Radhika Kanaskar 


Innovate Taking Steps Towards Splunk

After taking OPIM 3222, I learned about a really cool big data platform called Splunk. Its big selling point is that it can take any machine data and restructure it so that it is actually useful. So after I got a Fitbit Charge HR, my first thought was, “I wonder what we can find if we Splunk this.” I worked with Professor Ryan O’Connor on this project, and after several hiccups, we finally got a working add-on. Back when we first started, we found a “Fitbit Add-On” on Splunkbase that we thought we just had to install before we would be ready to go. After spending a lot of time trying to get it set up, we learned that it was a bit outdated and had bugs that made it quite difficult to use. For a while the project got pushed off to the side as we worked on other IoT-related projects in the OPIM Gladstein Lab.

By the time another spark of inspiration came along, the add-on had been removed because of its age and bugs, so we had to take matters into our own hands. Professor O’Connor and I split the work: I would use the Fitbit API to pull data, and he would work on getting it into Splunk. I wrote Python scripts to collect data on steps, sleep, and heart rate levels. We then found that the Fitbit API requires re-authenticating with OAuth2 every few hours in order to keep pulling data. Professor O’Connor had already tackled a similar issue when making his Nest add-on for Splunk, using a Lambda function from Amazon Web Services to process the OAuth2 request. We decided to use the same approach for Fitbit; the major difference is that the function is called every few hours. Professor O’Connor then made a simple interface that lets users get their API tokens and set up the add-on in minutes. We then took a look at all of the data we had and decided the best thing we could build was a step challenge. We invited our friends and family to track their steps with us and created a dashboard to summarize activity levels, show an all-time leaderboard, and visualize steps taken over the course of the day. However, this app only scratches the surface of what can be done with Fitbit data; the possibilities are endless from both a business and a research perspective. We have already gotten a lot of support from the Splunk developer community and are excited to see what people can do with this add-on.
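For readers curious about the data-pulling side, here is a hedged Python sketch of the two pieces involved: refreshing the OAuth2 access token and fetching a day of step data. The endpoint paths and response fields reflect our understanding of Fitbit’s public Web API; the client ID, secret, and tokens are placeholders, and this is not the add-on’s actual code.

import base64
import requests

# Rough sketch of refreshing a Fitbit OAuth2 access token and pulling a day
# of step data. Endpoints and fields follow Fitbit's public Web API as we
# understand it; the client ID, secret, and tokens are placeholders.
CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"


def refresh_access_token(refresh_token: str) -> dict:
    """Trade a refresh token for a fresh access token (expires after a few hours)."""
    basic = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    resp = requests.post(
        "https://api.fitbit.com/oauth2/token",
        headers={"Authorization": f"Basic {basic}"},
        data={"grant_type": "refresh_token", "refresh_token": refresh_token},
    )
    resp.raise_for_status()
    return resp.json()  # contains a new access_token and refresh_token


def get_daily_steps(access_token: str) -> str:
    """Fetch today's step total for the authorized user."""
    resp = requests.get(
        "https://api.fitbit.com/1/user/-/activities/steps/date/today/1d.json",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    return resp.json()["activities-steps"][0]["value"]


if __name__ == "__main__":
    tokens = refresh_access_token("YOUR_REFRESH_TOKEN")
    print("Steps today:", get_daily_steps(tokens["access_token"]))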

By: Tyler Lauretti, Senior MIS Major


Introducing Alexa to Splunk

Before the introduction of Apple’s “Siri” in 2011, artificial intelligence voice assistants were no more than science fiction. Fast forward to today, and you will find them everywhere, from your phone, where they help you navigate your contacts and calendar, to your home, where they help you around the house. Each smart assistant has its pros and cons, and everyone has their favorite. Over the last few years I have really enjoyed working with Amazon’s Alexa smart assistant. I began working with Alexa during my summer internship at Travelers in 2016, when I attended a “build night” after work where we learned how to start developing with Amazon Web Services and the Alexa platform. Since then, I’ve developed six different skills and received Amazon hoodies, t-shirts, and Echo Dots for my work.


So I mentioned “skills”, but I didn’t really explain them. An Alexa “skill” is like an app on your phone. Skills use natural language processing and other capabilities of the Alexa platform, the same technology behind Amazon “Lex”, to do whatever you can think of. Some of the skills I have made in the past include a guide to West Hartford, a UConn fact generator, and a quiz to determine whether you should order pizza or wings. However, while working in the OPIM Innovate lab I have found some other uses for Alexa that go beyond novelties. The first was using Alexa to query our Splunk instance. Lucky for us, the Splunk developer community has already done a lot of the legwork. The “Talk to Splunk with Amazon Alexa” add-on handles most of the networking between Alexa and your Splunk instance. In order for Alexa and Splunk to communicate securely, we had to set up a private key, a self-signed certificate, and a Java keystore. After a few basic configuration steps on the Splunk side, you can start creating your Splunk Alexa skill. The skill will be configured to talk to your Splunk instance, but it is up to you to determine what queries to run. You can create “intents” that take an English phrase and convert it to a Splunk query that you write. You can also use this framework to make Alexa do things like play sounds or say anything you want. For example, I used the Splunk skill we created to do an “interview” with Alexa about Bank of America’s business strategy for my management class. Below you can find links to the Alexa add-on for Splunk as well as a video of that “interview”.
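To illustrate the intent-to-query idea conceptually, the hedged Python sketch below maps made-up intent names to SPL searches and runs them as one-shot searches against Splunk’s REST API. This is not how the “Talk to Splunk with Amazon Alexa” add-on is implemented internally; the host, credentials, indexes, and queries are placeholders.

import requests

# Conceptual sketch only: mapping a spoken intent name to an SPL query you
# write, run via Splunk's REST search API. Host, credentials, indexes, and
# queries are placeholders; this is NOT the add-on's internal implementation.
SPLUNK_HOST = "https://splunk.example.edu:8089"
AUTH = ("admin", "changeme")  # hypothetical credentials

# Each Alexa intent name points at an SPL search the skill author wrote.
INTENT_TO_SPL = {
    "StepLeaderIntent": "search index=fitbit | stats sum(steps) as total by user "
                        "| sort - total | head 1",
    "GreenhouseTempIntent": "search index=greenhouse | stats latest(air_temp_f) as temp",
}


def run_intent(intent_name: str) -> list:
    """Run the SPL mapped to an intent as a one-shot search and return rows."""
    spl = INTENT_TO_SPL[intent_name]
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs",
        auth=AUTH,
        data={"search": spl, "exec_mode": "oneshot", "output_mode": "json"},
        verify=False,  # lab instance with a self-signed certificate
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


if __name__ == "__main__":
    print(run_intent("StepLeaderIntent"))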

By: Tyler Lauretti, Senior MIS Major

 

Talk to Splunk with Amazon Alexa: https://splunkbase.splunk.com/app/3298/

Alexa Talks Strategy

https://www.youtube.com/watch?v=UYaV8ybYV04