Devmoot 2023

Devmoot happened this past Friday at Relix, right up the road from my house. It’s a tech conference somewhat akin to Codestock, just a bit smaller: it runs on a single day and on a single track (all the speakers lined up in one room with one audience, rather than Codestock’s mix-and-match style). Tech conferences can generate real inspiration and connection among people who have a passion for technology, and for code in particular. It’s a tradition that goes all the way back to the computer-building hobbyists who held meetups in the 70s. It can certainly tie into your career, but I think the essence of a tech conference is curiosity: the desire to know, tinker, and create. Those virtues hold inherent value beyond any career.

The topics at Devmoot shared a general theme around the currently hot subjects of AI and machine learning, and ChatGPT specifically. Jeff Prosise was the keynote speaker and gave a great opening talk about it. But first, some clarification of definitions:

“Artificial Intelligence (AI) is an umbrella term for computer software that mimics human cognition in order to perform complex tasks and learn from them. Machine Learning (ML) is a subfield of AI that uses algorithms trained on data to produce adaptable models that can perform a variety of complex tasks.”

(quote here)

These tools grew out of earlier work on language models, à la Google Translate, which used a Recurrent Neural Network (RNN). An RNN processes language sequentially, which works well for small chunks of text but chokes on anything bigger. Enter the Transformer encoder/decoder, which can process large chunks of text all at once (non-sequentially). It’s a “next-word predictor” at its core, and it can be trained to get better and better at predicting and even “understanding” context (like the different uses of a single word – “park”, for example). Google’s BERT was built on the Transformer architecture (the same line of work behind Google Translate) and kicked off the current AI craze (at least according to my notes). From there, a company called OpenAI developed GPT-1 (“Generative Pre-trained Transformer”); eventually GPT-3 would give rise to ChatGPT, which was essentially trained on a snapshot of the internet as a whole (via that “machine learning” thing). That’s a lot of data…but it’s reaching the end, and hungers for more!
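To make “next-word predictor” concrete, here’s a minimal sketch of my own (not something from the talk) using the open-source Hugging Face transformers library and the small GPT-2 model: given a prompt, the model scores every token in its vocabulary as a candidate for the next word.

```python
# Minimal next-word-prediction sketch (assumes: pip install torch transformers).
# My own illustration, not from the Devmoot keynote.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "I took my dog to the"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# The logits at the last position score every vocabulary token as a
# possible next word; softmax turns those scores into probabilities.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, 5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id))!r}: {float(prob):.3f}")
```

Run that and you should see plausible continuations (words like “vet” or “park”) ranked near the top. That’s all “prediction” means here: pick a likely next token, append it, and repeat.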

There were follow-up talks by others that drilled into some of the interesting ramifications of this new technology. How do you navigate the legality of intellectual property when an AI consumes content (i.e., works by humans) and then uses that raw material to generate “new” content? How new is it? Is that different from when people do the same thing? (Tolkien borrowed heavily from Norse mythology, and then Tolkien’s work was mined by practically every fantasy work since.) Amy Fletcher gave an interesting talk about “disruptive tech” and how AI could be another game changer in that regard, not just for the tech industry but for modern society as a whole.

