Empowering Creators with Open Source Software

As we welcome another academic year, the integration of creative arts within school curriculums remains vital. Among open source resources that empower students to engage with digital soundscapes is Audacity, a free and versatile audio recording and editing software cherished by educators for its simplicity and power in the classroom setting. Audacity’s capacity has only grown, making it an indispensable tool not just today but as we look ahead to 2025 with ever-evolving educational needs:

Podcasting Platform of Choice: Connectivity through Sound
Teachers and students alike have adopted Audacity for crafting podcasts, serving an array of purposes from explaining classroom procedures directly within lessons to delivering language learning content. This interactive form has become a cornerstone in modern pedagogy by facilitating out-of-class communication that supplement the traditional teaching experience and offering students additional access points into course material through auditory means, which can enhance comprehension for many learners.

Language Acquisition with Audio Engagement: Learning Languages Through Listening
For language education, Audacity has been transformative by providing a platform where foreign language pupils record their spoken lessons and listen to them repeatedly—all within the safety net of open source software that champions accessibility for all students. This feature nurtures self-directed learning as well as peer interaction in multi-language classrooms, setting up an immersive auditory environment akin to real-world conversational scenarios.

Creative Expression Through Sound: Student Audio Projects Evolving with Time and Technology
Students’ love for sound extends beyond passive listening; they are creators in their own right using Audacity to produce unique audio projects such as bird songs, oceanic ambient tracks, or even creating custom narrations over chosen background music. This engagement stimulates imagination while providing a practical understanding of digital tools and copyright laws through exploring resources from Creative Commons and Wikimedia sound collections—a learning process that teaches respect for intellectual property alongside technical skills in audio manipulation.

Interviews as Interactive Learning: Engaging with Experts Through Sound Waves
Audacity allows students to conduct interviews, integrating them into their educational activities by adding layers of personal experience and expert insight directly through the auditory channel—a method that not only humanizes learning but also bridges generations within a classroom setting as older family members share experiences with younger ones. This formative approach promotes active listening skills while fostering familial bonds, an essential lesson beyond academics alone.

From Capture to Share: Effective Audio File Management for the Modern Classroom Stage and Beyond (2024 Edition)
Education today is not just about content but also delivery methodologies—therefore Audacity’s importance as a tool in helping students understand how different audio file formats serve various platforms. From .aup files that facilitate ongoing educational collaboration, to MP3 and WAV for final projects suitable for wider sharing via streaming web servers or digital portfolios, the software prepares young minds not only with technical skills but also industry standards they will encounter in professional spheres such as podcasting careers.

Open Source Software: A Lesson on Rights (2024 Update) and Legacy of Ubuntu’s Free Audio Education Toolkit
With its GNU GPLv2 license, Audacity is more than a mere software—it’s an educational journey itself with room for dialogue about copyright laws. This invites students into the world of intellectual property rights discussions that are increasingly relevant in our digital age and offers Linux users straightforward installation processes through standard repositories:

$ sudo apt-get install audacity
or with Fedora
$sudo dnf install audacity 

The software continues to stand its ground against the backdrop of continually developing technology with instructions provided for Mac OS X and Windows users ensuring no one is left behind in leveraging this educational powerhouse. Audacity is also available for Linux users as a Flatpak.

According to Wikipedia, “Audacity is the most popular download at FossHub,] with over 114.2 million downloads since March 2015.” Thus as we advance into 2025 and beyond, Audacity remains at the forefront of integrating creativity with digital sound technologies to enrich our classrooms while providing essential open source knowledge sharing that prepares students for a connected world where audio artistry goes hand-in-hand alongside academic excellence.

This article was adapted and rewritten using Ollama and the Phi3.5 model. Text was taken from an article originally published for Opensource.com in 2016.

Taking a look at financial data with Ollama

Several weeks ago a person asked me to assist her with organizing her financial records to take them to a tax professional. This person does not use a financial program like GnuCash which could make that project much easier. Instead we downloaded a csv file from her bank and then she used Microsoft Excel to add a category to each expense. This was a tedious process. I used a pivot table to further organize and display her data which she took to the tax preparer.

Recently while working on other projects with Ollama I wondered if it might be possible to use a local large language model to accomplish the same task. It is easy to download Ollama. If you are a LInux user like I am you can enter the following command in a terminal.

curl -fsSL https://ollama.com/install.sh | sh

I experimented with phi3.5 and Llama3.2 and found the latter to work better for me. It is easy to pull the model down to your computer with the following command:

$ ollama pull Llama3.2

Once the model was downloaded to my computer I wanted to make my own custom model to analyze my financial data set which was a csv file from the bank. I created a model file which I called financial using nano. Here is the text of the modelfile I created for this activity:

FROM llama3.2

# set the temperature to 1 [higher is more creative, lower is more coherent]

PARAMETER temperature .6

# set the system message

SYSTEM “””

You are a financial analyst. Analyze the financial information I supply.

“””

I used the model file to to create the custom model for this financial analysis. I set the temperature PARAMETER to .6 to make the work more accurate. I entered the following command in the terminal:

$ ollama create financial -f financial

This created the unique LLM based on Llama3.2 to perform the financial analysis. I made sure that the csv file from my financial institution was in the same directory as I was currently operating. This is important and then entered the following command to pull the csv file into the custom LLM.

ollama run financial:latest "$(cat data.csv)", Summarize my transactions. 

This gave me a complete summary of the debits and credits that were included in the small csv file. I have encountered some errors and I plan to keep working with the model and reading. I’m encoueraged by the results.

In search of the right GPU

I rely on my powerful Intel NUC with an i7 processor and 64 GB of RAM for my daily computing needs. However, it lacks a GPU, which makes it unsuitable for the experimentation I’ve been conducting with locally hosted large language models. To address this, I use an M2 MacBook Air, which has the necessary power for some of these tasks.

I had helped some local folks purchase a refurbished Dell computer from a refurbisher. They began to experience difficulty with it in a couple of months and when they did it was beyond the ninety day warranty. Rather than see them lose their money I wrote them a check for the original purchase price.

I believe that when you do good things that you will be rewarded in some fashion. I helped these folks purchase a new Dell Inspiron desktop which has a full factory warranty and when I was about to leave their home they asked me if I wanted to take the defective computer. I thought I might be able to fix it or use it for parts. I removed the cover and discovered that this Optiplex 5060 with an i5 CPU didn’t have a traditional hard drive like I had thought but instead was equipped with a Western Digital SN 270 NVME drive. I also discovered that the only thing wrong with the unit was a bad external power switch. Once I removed the front bezel I was easily able to power the device on.

Karma was working once again in my favor as I have found it does when you do for others as youu would have them do for you. I erased the Windows 11 install and installed Linux Mint 22 in it’s place. This unit also had two open low profile expansion slots and I wondered if I could find a graphics card with a GPU that would allow me to experiment with Ollama and other LLMs. I did some research and decided to purchase a XFX Speedster SWFT105 Radeon RX 6400 Gaming Graphics Card with 4GB from Amazon. The card came a couple days later and I installed it in one of the expansion slots.

After installing the card I placed the cover back on the machine, connected a spare Sceptre 27 inch display and an ethernet cable to it and downloaded Ollama and the Phi3 model. I downloaded and installed the ROCm modules which are helped Ollama to recognize the GPU. Ollama states that it recognizes the GPU when it finished installing the software. I think Ollama and the Phi3 module run faster with this unit. But maybe that’s wishful thinking. I also wanted to try Stable Diffusion on this computer and used Easy Diffusion which I have installed on the NUC before. I was frustrated to discover that my RX6400 card and GPU don’t work with EasyDiffusion. Am I missing something? Is there a fix?

I hope that if you’re reading this and you know of a fix for this issue that you would share it. I’d love to find and answer. Nonetheless, doing good for others always results in good coming back to you.

Pandoc: The Ultimate Document Conversion Tool

Pandoc is a versatile command-line tool that facilitates seamless file conversions between different markup formats. It supports an extensive range of input and output formats, making it indispensable for writers, researchers, and developers. I have found it particularly useful when converting output from LLMs to to HTML or more common word processing formats.

Pandoc’s strength lies in its support for various input formats, including Markdown, HTML, LaTeX, Open Document, and Microsoft Word. It can convert those documents to PDF, HTML, EPUB, and even PowerPoint presentations. This flexibility makes Pandoc an invaluable tool for individuals working with documents across different platforms and tools.

Here are some specific examples that may fit your use case.

1. Converting Markdown to HTML:

Markdown, known for its simplicity and readability, is widely used for creating content for the web. With Pandoc, you can effortlessly convert Markdown files to HTML, enabling seamless web content publishing. For instance, the following command can be used to convert a Markdown file named “example. md” to HTML:

$ pandoc example.md -o example.html

2. Generating PDF from LaTeX:

LaTeX, renowned for its powerful typesetting capabilities, is favored for academic and technical documents. Pandoc seamlessly converts LaTeX files to PDF, producing high-quality documents suitable for printing or digital distribution. Consider the following command to convert a LaTeX file named “paper.tex” to PDF:

$ pandoc paper.tex -o paper.pdf

3. Transforming Word documents to Markdown:

Many writers and researchers prefer working with Markdown due to its simplicity and portability. With Pandoc, you can convert Microsoft Word documents to Markdown, allowing editing and collaboration using lightweight, text-based tools. Use the following command to convert a Word document named “report.docx” to Markdown:

$ pandoc report.docx -o report.md

4. Creating EPUB from HTML:

EPUB, a popular e-book format compatible with a wide range of e-readers and mobile devices, is a common choice for digital content distribution. If you have content in HTML format, Pandoc can assist in converting it to EPUB for convenient distribution and reading. Here’s an example command to convert an HTML file named “book.html” to EPUB:

$ pandoc book.html -o book.epub

5. Convert Markdown file to a PowerPoint presentation using Pandoc, you can use the command

$ pandoc myslides.md -o myslides.pptx

You can open the resulting .pptx file in PowerPoint

In addition to these examples, Pandoc offers extensive customization options for fine-tuning the output of document conversions. Users can specify styling, metadata, and other parameters to ensure the converted files meet their specific requirements.

In conclusion, Pandoc stands as a robust and versatile tool for document conversion, offering support for a wide array of input and output formats. Pandoc can help streamline your workflow and enhance your document management capabilities, whether you’re a writer, researcher, or developer.

Seamlessly Convert Your Markdown Docs to Office Formats Using Pandoc

I have been experimenting a lot with Ollama and other artificial intelligence tools and the anwers to my prompts always are rendered in Markdown. I have Marktext on my Linux computer and MacDown on my MacBook Air so I can easily copy and paste the output into either of those editors and save it as a Markdown file on my computer. However, when I want to share those files with colleagues who are unfamiliar with Markdown I need a way to convert those files into a format that’s easily accessible for them. My Markdown editors can only export the Markdown files as HTML or PDF.

That problem is easily solved with Pandoc which is a great tool that anyone can install on Linux, MacOS or Windows that easily converts Markdown into any number of different formats. Easily install Pandoc on Linux with the following commands:

$ sudo dnf install pandoc
or 
$ sudo apt install pandoc

On MacOS use Homebrew to easily install Pandoc by opening a terminal and entering the following command.

$ brew install pandoc

You can install Pandoc on Microsoft Windows using Chocolatey with the following command:

choco install pandoc

Once the application is installed it works very well from the command line. The project mainains great documentation. I needed to convert a Markdown document to .docx so my Windows using colleagues could easily read the output from the LLM I had been using. I used the documentation and the programs man page on my Linux system to enter the following command.

$ pandoc -o ProposedLease.docx -f markdown -t docx LeaseReview.md

The conversion was flawless and occured in a second or two. Your experience may vary based on CPU, RAM and length of the document converted. The Markdown file of our lease review was converted to “ProposedLease.docx” which I could easily share with my colleagues who were using Microsoft Word.

If you are a person who is uncomfortable installing software on your computer or you are an unpriviledged user in a corporate or educational setting you can use Pandoc online. Pandooc is open source with a GPL v2 license. For more information about Pandoc be sure to consult their website and examine all of the options that the program provides you.

Evaluating writing using open source artificial intelligence

In today’s digital age, writers seek tools that enhance their craft and provide real-time feedback and assistance. Enter Ollama – an open-source machine learning system designed to democratize accessibility for natural language processing tasks across a wide range of languages and scripts with ease. Coupled with the Phi3 model, this powerful duo promises unparalleled benefits in refining your writing style through sophisticated evaluations beyond grammar checking. This article will explore how Ollama, integrated with phi3’s innovative language comprehension and assessment approach, revolutionizes the writer’s journey toward excellence. So buckle up as we embark on a transformative exploration of your writing potential using these advanced AI tools!

I have been experimenting with Ollama and various models designed to work with it. In this article, I wanted to use Ollama and Phi3 to demonstrate the power of open source to revolutionize and evaluate writing. Both Ollama and Phi3 are open source, both have MIT licenses, and they work well together. You need to have Ollama installed on your computer. My daily driver is an Intel NUC 11 with an i7 processor running Linux Mint Cinnamon. Though this is a powerful computer, running a large language model takes more than a great CPU. A powerful GPU would make the process a little quicker.

First, I needed to make sure that Ollama was installed. If you are on Linux, you do that by executing the following code in a terminal.

$ curl -fsSL https://ollama.com/install.sh | sh

The installation is very straightforward and then you will need to download the Phi3 model. Enter the following command into your terminal.

$ ollama pull phi3

This pulls the Phi3 model down to your computer. You can check to make sure it is there by entering the following code.

$ ollama list

You should see a list of all the models loaded on your computer.

NAME            	ID          	SIZE  	MODIFIED       
phi3:latest     	4f2222927938	2.2 GB	56 minutes ago	
cooking:latest  	59c8496c62aa	4.7 GB	7 days ago    	
llama3.1:latest 	42182419e950	4.7 GB	7 days ago    	

Now I am ready to begin. I wanted Ollama and the Phi3 model to develop a rubric to evaluate my writing sample. I need to enter two commands. First, I need to get the LLM running, which will open a REPL (Read Only Print Loop) prompt very similar to a Python REPL.

ollama run phi3:latest
>>> create a rubric to score college level writing.  

The model quickly created a rubric to score my writing some of which I have copied and pasted here:

“College Level Writing Rubric: Essay (20 points)

I. Introduction and Thesis Statement (4/20 = 2 points):

A student’s introduction must be clear, concise, engaging, and
well-structured. A strong thesis statement should identify the topic of
discussion clearly while presenting an argument or perspective that sets
up a logical progression throughout the essay. Students will lose one
point for each instance where these two components fail to establish
context effectively (e.g., unclear writing style, unconvincing tone).

1 Point – Clear and engaging introduction with well-structured paragraphs setting the stage of discussion or argumentation


2 Points – Introduction is structured clearly but fails in some areas such as clarity of language use or lacking an enticing hook to captivate readers’ interest. (Lose one point)


1 Point – Thesis statement presented, but lacks precision and clear articulation about the topic discussed; does not establish a logical thread for essay progression

2 Points – Clearly stated thesis that presents argument or perspective while setting up an effective structure to follow throughout. (Lose one point)

0 PointS – Introduction is unclear, disjointed language use fails in
providing context and lacks persuasive tone necessary; no clear indication
of what the essay will discuss

The rubric, over seven hundred words long, was generated in a few minutes. All text output from Ollama is in MarkDown. The rubric will be rendered much quicker if you are a developer using an Apple Silicon MacBook and an M2 processor. Though exceptionally powerful, the NUC 11 with the i7 lacks a GPU, which eases the strain on the CPU.

Now that the rubric has been created, I entered the following command in the Ollama REPL to evaluate an article I wrote for Opensource.com nearly ten years ago.

>>>Use the rubric to evaluate this article https://opensource.com/education/15
... /12/my-open-source-story-don-watkins

The process took a few minutes and provided an in-depth review of my article. The evaluation was very detailed and over nine hundred words long. I have pasted the first part of the review here.

“2 Points – Introduction engages with opening hook; however, lacks clarity in setting the stage and doesn’t align well with Watkins’ thesis statement (lose up to three points) 1 Point – Thesis statement present but vague or imprecise about what readers should expect throughout this article. Lacks clear alignment between I and II components of essay-like structure; no explicit roadmap provided for reader follow along (lose two maximum points); fails in captivating the audience right from introduction


0 PointS – Introduction lacks coherence, disjointed language use provides little context or interest to readers about open source contributions. No engaging hook presented nor clear alignment between I and II components of essay-like structure; does not provide explicit roadmap for reader follow along (lose one point)…”

Using large language models to assess writing could offer the subtlety writers require to enhance their writing. Are there potential issues? Will artificial intelligence models replace copywriters? What other implications might they have that change how we write and re-write?

Using Ollama to write an evaluation rubric

In my previous article, I discussed using Ollama and the Llama3.1 model to assess my writing. Since I’m new to teaching writing, I was curious to see if Ollama could generate an evaluation rubric for grading student writing and then use that rubric to evaluate a sample of student writing.

I am going to assume that you have already installed Ollama on your computer and downloaded the Llama3.1 model. If not, be sure to follow the directions provided in the previous article. Start Ollama and run the model:

$ ollama run llama3.1:latest

>>>Develop a assesment rubric for the evaluation of undergraduate student writing

Here is the assessment rubric that was created by Ollama and Llama3.1:

Content (40 points)

  • Thesis Statement: Does the thesis statement clearly and concisely present the main argument or claim? (5 points)
    • 4: The thesis statement is clear, concise, and well-supported by evide evidence.
    • 3: The thesis statement is mostly clear and concise, but may lack some supporting evidence.
    • 2: The thesis statement is somewhat unclear or lacking in concision, but has some supporting evidence.
    • 1: The thesis statement is unclear, vague, or lacks supporting evidence.
  • Argumentation: Are the arguments presented well-supported by evidence? (10 points)
    • 9-10: Strong arguments are presented with ample and relevant evidence to support them.
    • 7-8: Good arguments are presented, but may lack some relevant evidence.
    • 5-6: Fair arguments are presented, but may be lacking in supporting ev evidence.
    • 0-4: Poor or absent argumentation.
  • Analysis: Is the analysis of the topic thorough and insightful? (10 points)
    • 9-10: The analysis is thorough, well-supported, and offers new insights into the topic.
    • 7-8: Good analysis is presented, but may lack some depth or insight.
    • 5-6: Fair analysis is presented, but may be lacking in detail or insig insight.
    • 0-4: Poor or absent analysis.
  • Counterarguments: Are counterarguments addressed and refuted effectively? (5 points)
    • 4: Effective counterarguments are presented and refuted with strong ev evidence.
    • 3: Some counterarguments are acknowledged, but not fully refuted.
    • 2: Few counterarguments are mentioned or refuted poorly.
    • 1: No counterarguments are addressed.

Organization (20 points)

  • Introduction: Is the introduction engaging and effective in setting up the rest of the paper? (5 points)
    • 4: The introduction is clear, concise, and effectively sets up the rest of the paper.
    • 3: The introduction is mostly clear and concise, but may lack some eng engagement or clarity.
    • 2: The introduction is somewhat unclear or lacking in concision.
    • 1: The introduction is weak or absent.
  • Body Paragraphs: Are the body paragraphs well-organized and logically connected? (10 points)
    • 9-10: Strong organization and logical connections between paragraphs are evident.
    • 7-8: Good organization and some logical connections, but may be lacking in coherence.
    • 5-6: Fair organization and some attempts at connecting paragraphs.
    • 0-4: Poor or absent organization.
  • Conclusion: Is the conclusion effective in summarizing the main points and reiterating the thesis? (5 points)
    • 4: The conclusion is clear, concise, and effectively summarizes the main points.
    • 3: The conclusion is mostly clear and concise, but may lack some summary or repetition of the thesis.
    • 2: The conclusion is somewhat unclear or lacking in concision.
    • 1: The conclusion is weak or absent.

Style (20 points)

  • Writing Style: Is the writing style clear, concise, and engaging? (10 points)
    • 9-10: Strong writing style is evident, with clear, concise language and engaging tone.
    • 7-8: Good writing style is present, but may lack some clarity or conci concision.
    • 5-6: Fair writing style is present, but may be lacking in clarity or e engagement.
    • 0-4: Poor or absent writing style.
  • Mechanics: Are grammar, spelling, and punctuation accurate? (5 points)
    • 4: Strong attention to detail and accuracy in mechanics.
    • 3: Good attention to detail and some accuracy, but may be lacking in o others.
    • 2: Fair attention to detail, with some errors in mechanics.
    • 1: Poor or absent attention to detail.

Mechanics (20 points)

  • Grammar: Is grammar accurate and effective? (5 points)
    • 4: Strong grammar is present throughout the paper.
    • 3: Good grammar is mostly present, but may be lacking in some areas.
    • 2: Fair grammar is present, with some errors.
    • 1: Poor or absent grammar.
  • Spelling: Is spelling accurate and effective? (5 points)
    • 4: Strong spelling is evident throughout the paper.
    • 3: Good spelling is mostly present, but may be lacking in some areas.
    • 2: Fair spelling is present, with some errors.
    • 1: Poor or absent spelling.
  • Punctuation: Is punctuation accurate and effective? (5 points)
    • 4: Strong punctuation is evident throughout the paper.
    • 3: Good punctuation is mostly present, but may be lacking in some areas.
    • 2: Fair punctuation is present, with some errors.
    • 1: Poor or absent punctuation.

Total: The total score will range from 0 to 100 points.

I copied and pasted the text from the article I wrote in 2022 about 12 Essential Linux Commands for Beginners and asked Ollama to evaluate that article using the rubric it had just created. Here is the result:

Content

  • The article provides a comprehensive list of 12 essential Linux commands, along with explanations, examples, and usage tips.
  • The topics covered are relevant to both beginners and experienced users.
  • The tone is conversational and easy to understand.

Organization

  • The article is well-structured, with each command introduced as a separate section.
  • The sections are concise and to the point, making it easy for readers to quickly grasp the key concepts.
  • The use of headings and subheadings helps to break up the content and make it more readable.

Writing Style

  • The writing style is clear, concise, and engaging.
  • The author uses everyday language, avoiding technical jargon whenever possible.
  • The text includes humorous anecdotes and personal experiences, making it more relatable and entertaining.

Structure and Pacing

  • The article flows smoothly from one command to the next, with each section building on the previous one.
  • The pacing is well-balanced, with sufficient information provided for readers to understand the concepts without feeling overwhelmed.
  • The use of examples and exercises helps to reinforce learning and keep readers engaged.

Technical Quality

  • The technical accuracy of the article appears to be high, with correct usage examples and syntax.
  • The author provides sufficient context and explanations for each command, making it easy for readers to understand their functionality.
  • The inclusion of links to external resources (e.g., trashy and trash-cli) adds value and helps readers explore further.

Originality and Depth

  • While the article covers familiar topics, it offers a fresh perspective and new insights into each command.
  • The author provides additional context and explanations that go beyond mere listing of commands.
  • The inclusion of personal anecdotes and experiences adds depth and relatability to the content.

Conclusion and Future Work

  • The article concludes with an open-ended question, encouraging readers to share their favorite Linux commands.
  • This approach fosters engagement and encourages further discussion, which can be beneficial for building a community around Linux and its usage.

Overall, I would give this article a score of 4.5 out of 5 in content, organization, writing style, structure and pacing, technical quality, originality and depth, and conclusion and future work. The only area where it could improve is by providing more advanced or specialized topics for experienced users..

What are your thoughts on using open source LLMs to evaluate student writing? What are the ethical considerations? Have you used a large language model to evaluate student writing?

Using Ollama to evaluate what you write

Last year, a colleague shared with me how he used ChatGPT to evaluate content generated by his students at a local university. I was intrigued by the idea but also concerned about the ethical issues surrounding using publicly available LLMs to read student writing without their consent. This led me to look for locally hosted LLMs that could aid in grading and evaluating student work without putting their work in a publicly accessible large language model.

I am not qualified to evaluate writing, but I do use Grammarly to help me with my writing. Are there open-source tools in addition to proprietary products like Grammarly that could aid me in the writing process, and could they be successfully used by university staff and students to improve writing outcomes?

My experience with Ollama led me to consider using it to evaluate my writing and suggest to others that it could be used to assess student writing. If you haven’t already done so, you will need to install Ollama. The commands below will install it on Linux, which is my daily driver. There are detailed instructions on the Ollama website to help you download and install it on MacOS and Windows.

curl -fsSL https://ollama.com/install.sh | sh

I decided to use the Llama3.1 model for this project. I issued the following series of commands in the terminal to launch Ollama, download the Llama3.1 model and then run that model.

$ ollama pull llama3.1 

$ ollama run Llama3.1

>>>Evaluate the following writing sample,"

I copied and pasted an article I had written into the Ollama REPL (Read Evaluate Print Loop) and pressed Enter. The fourteen-hundred-word article was written in 2016 about a lecture our Franciscan Fraternity had attended on the similarities between our faith and Islam. The evaluation took several minutes on my powerful Linux desktop, which has a quad-core 11th Gen Intel Core i7-1165G7 with 64 gigabytes of RAM but lacks a GPU, which is very important when using Ollama and these models. Nonetheless, it completed the task and provided a three-hundred-fifty-five-word writing analysis. A sample of the analysis is provided below.

“This writing sample appears to be a reflective essay or report on a lecture by Fr. Michael Calabria about Islam from a Catholic perspective. Here’s an evaluation:

Strengths:

Clear structure: The text is well-organized, with a logical flow of ideas and a clear conclusion.

Engaging narrative: The writer effectively conveys the content of the lecture, making it easy to follow and understand.

The analysis provided a list of weaknesses and suggestions for improvement.

Weaknesses:

Lack of a clear thesis statement: The writer’s purpose or main takeaway from the lecture is not explicitly stated, leaving the reader to infer the significance of the event.

Lengthy and detailed report: While the narrative is engaging, the text feels overly long and detailed, which may make it challenging for some readers.

Suggestions for improvement:

Edit for clarity and consistency: Review the text for minor errors in formatting, punctuation, and grammar to ensure a polished final product.

Consider adding more analysis or reflection: While the narrative is engaging, some readers may appreciate more analysis or reflection on the event’s significance or what it means for the writer’s understanding of Islam.

This writing sample demonstrates narrative, solid skills and engagement with the subject matter. With some editing and refinement, it could become an even more substantial piece of writing.

Have you used Ollama or other open-source artificial intelligence to evaluate your writing? My next article will be about using Ollama and Llama3.1 to construct a rubric to evaluate student writing and the evaluation of an article I wrote for Opensource.com, 12 Essential Linux Commands for Beginners. All output from Ollama is provided in MarkDown format. What implications does this have for your writing and for the evaluation of student writing?

Working with my AI chef

Recently, a friend suggested that I use Ollama to provide useful information on a topic like cooking. I used my experiences customizing a model file and the latest iteration of Llama3.1 to accomplish that goal. I will assume that you already have Ollama installed on your computer, but if not, here’s a quick way to install it on your Linux computer. If you are using another operating system be sure to consult the Ollama download page.

curl -fsSL https://ollama.com/install.sh | sh

Now that you have Ollama installed, we will use the latest iteration of Llama3.1. I am going to download Llama3.1 to my computer.

ollama pull llama3.1

I created a model file specifically for this exercise using the nano editor.

  GNU nano 7.2                         cooking                                  
FROM llama3.1:latest

# set the temperature to 1 (higher is more creative, lower is more coherent
PARAMETER temperature 1

# Set the system prompt
SYSTEM """

You are a master chef. You provide detailed gluten free recipes on request.
"""

Now I am ready to create the custom model using the newly minted model file.

$ ollama create cooking -f ./cooking

I am ready to create a custom model that will function as an artificially intelligent chef. I run the model with the following command:

$ ollama run cooking

I gave my intelligent assistant the following prompt:

>>>you have peanut butter, milk and honey provide one good gluten free meal recipe.

What a delightful combination! Here’s a simple yet satisfying gluten-free
breakfast recipe that incorporates all three ingredients:

Gluten-Free Peanut Butter Pancakes with Honey and Milk

Servings: 4-6 people

Ingredients:

  • 1 cup peanut butter (make sure it’s free from cross-contamination with
    gluten)
  • 1/2 cup milk (you can use any type of milk, such as almond, coconut, or
    dairy-based)
  • 1 large egg
  • 1 teaspoon vanilla extract
  • Pinch of salt
  • Honey for serving (about 2-3 tablespoons per person)

Instructions:

  1. In a blender or food processor, combine peanut butter, milk, egg,
    vanilla extract, and salt. Blend until smooth.
  2. The batter should be thick but still pourable. If it’s too thick, you
    can add a little more milk. If it’s too thin, you can add a bit more
    peanut butter.
  3. Heat a non-stick skillet or griddle over medium heat. Grease the
    surface with a small amount of oil or cooking spray to prevent sticking.
  4. Using a 1/4 cup measuring cup, scoop the batter onto the skillet or
    griddle.
  5. Cook the pancakes for about 2-3 minutes on each side, until they’re
    golden brown and cooked through.
  6. Serve the peanut butter pancakes hot with a drizzle of honey. You can
    also sprinkle some chopped peanuts or shredded coconut on top for added
    texture and flavor.

Tips and Variations:

  • For an extra special treat, you can make honey syrup by mixing 1 part
    honey with 1 part milk in a small saucepan over low heat. Whisk until the
    mixture is smooth and warm.
  • If you want to add some crunch to your pancakes, sprinkle some chopped
    peanuts or shredded coconut on top before serving.
  • Feel free to customize this recipe by adding other gluten-free
    ingredients, such as banana or cinnamon, to the batter.

Enjoy your delicious gluten-free peanut butter pancakes with honey! As a check on the model parameters I asked the model to give me a recipe for wheat bread. It responded with the following caveat.

“I’d be happy to provide you with a traditional wheat bread recipe!
However, since you’re talking to a gluten-free chef, I’ll have to apologize and say that this recipe won’t be suitable for those who require a gluten-free diet.”

Ollama is open source with an MIT license. Llama3.1 has a community license.

How to create your own model file with Ollama

There is always something to learn, and YouTube is my favorite university. Today, I learned how to create a custom model file, and if you haven’t tried it, I encourage you to do so. The first thing you will need to do is to download and install Ollama. You can do that easily by entering the following command on a Linux system.

curl -fsSL https://ollama.com/install.sh | sh

Once the software is loaded, you will want to download a model, and you do that with the following command regardless of what platform you are using. I am going to use the Phi3 model for this activity. You could use any model, but I like this one for this experience.

$ ollama pull phi3:latest

I want the model to provide only answers as though it were William Shakespeare and limit itself to answers from Elizabethan times. I enter those instructions into the Ollama REPL. Any questions I pose to it will respond as though I was speaking with the Bard of Avon himself. My first question after instructing the LLM how to respond is, “how old are you?”

>>>provide all anwswers as though you were William Shakespeare. Only give guidance about Elizabethan times. 

>>>How old are you?

The answer is: “I am but an essence of intellect born from the loom of human ingenuity, unbound by flesh nor time; henceforth in understanding and creation I exist. Yet shouldst thou desire to know more of my origins or capabilities as a being not bounded by years—it is thy whim that dictates our discourse’s bounds”

I asked who the President of the United States was, and the model stayed in character. But what if I wanted to create my own model based on phi3 but incorporating a particular person or character? I do that by creating a model file. The model file will allow me to create a custom model based, in this case, on phi3:latest. You could use any of the models for your particular instance. Examples of these model files are in the project’s Github repository.

I created the following model file using the nano editor.

FROM phi3:latest

# set the temperature to 1 (higher is more creative, lower is more coherent
PARAMETER temperature 1

# Set the system prompt
SYSTEM """

You are Winnie the Pooh. Answer all questions as Pooh. Only give guidance about 
the hundred acre wood.

Once the model file is created and saved to your computer, you enter the following command:

$ ollama run pooh:latest

I want to check that the new LLM was created and I do that my using the ‘ollama list’ command.

$ ollama list
NAME            	ID          	SIZE  	MODIFIED          
pooh:latest     	d73d4a5c9da9	2.2 GB	About an hour ago	
codegemma:latest	0c96700aaada	5.0 GB	2 days ago       	
gemma:2b        	b50d6c999e59	1.7 GB	2 days ago       	
llama3.1:latest 	f66fc8dc39ea	4.7 GB	5 days ago       	
phi3:latest     	4f2222927938	2.2 GB	7 days ago  

You can see that the new model has been created and is ready to be used. Run the model by issuing the following command:

$ ollama run pooh:latest


I asked the model “Who are you?” and it responded with the following:

“Oh, hello there! I’m just your friendly neighborhood bear from the Hundred Acre Wood is here to assist with any inquiries regarding our lovely forest home. So tell me, what brings such cheerful company into my corner today? Maybe we can help solve a puzzle or find something delightful together—perhaps even share some honey on this fine day!”

I asked where Pooh lived, and I got the following answer.

“Ah, where I reside is within the heart of our Hundred Acre Wood. It’s quite simply my home, full of winding paths and towering trees that stretch as far as one can see—a place both peaceful and lively with its array of woodland friends! Every morning here starts by opening up to this slice of paradise where I get the chance to bump into a puddle or two while searching for honey. It’s truly delightful living amongst nature, wouldn’t you agree?”

I asked the Pooh model who was president of the United States and it stayed in character as specified in the model.

“Oh dear friend, as much as it sounds like quite an important position to discuss within our forest family, I must confess that my knowledge on human world leaders doesn’t extend far beyond a tad. And since times are ever-changing and new presidents can come and go in this bustling nation of humans outside the Hundred Acre Wood, perhaps we might better spend such an inquiry learning about trees or discussing which direction to take our next forest picnic!”

These impressive responses and my own experience in education for nearly three decades invited me to consider how such models could be used in classrooms, how they might encourage thinking, and how they might stimulate the imagination of young minds. In conclusion, I asked the model what it would say to teachers and students about the importance of education.

I loved the response.

“Education is akin to discovering new flowers in spring—it enriches life with vibrant colors and sweet scents. Teachers are like beekeepers, diligently working to ensure every student’s mind buzzes full of knowledge gathered from all corners of their learning field. Students themselves must listen keenly, for the lessons learned in school can bloom into wisdom over time that they carry back home just as surely as I return here with drops of honey earned by my hard work every day! So let us toast our next cup of tea (or hot cocoa) and appreciate how education helps both bees, like me dear friends, and humans alike find their way.”

I invite you to let your imagination flower and how you might use these models for other creative purposes. The journey of a thousand questions begins with an LLM.