Working with Ollama

Now that I was working with Ollama, I needed to figure out where the models were stored and how much space they were occupying. This is especially important on my MacBook Air, which has a 256 GB drive that is already half full. My Linux computer, on the other hand, has a terabyte NVMe drive, so I’m not as worried about storage there. However, I still like to track where everything is and how much storage it uses. I wanted to compile a list of the downloaded models and learn how to delete them.
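Before reaching for the CLI, you can look at the model store on disk. A minimal sketch, assuming the default storage locations (Ollama honors the OLLAMA_MODELS environment variable if the store has been moved):

```shell
# Total disk space used by downloaded models.
# Default locations are assumptions; override with OLLAMA_MODELS:
#   macOS / user install:     ~/.ollama/models
#   Linux (service install):  /usr/share/ollama/.ollama/models
du -sh ~/.ollama/models 2>/dev/null
sudo du -sh /usr/share/ollama/.ollama/models 2>/dev/null
```

Only one of the two paths will typically exist on a given machine; the other prints nothing.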

After Ollama is installed, you can get a list of the available commands by entering the following in the terminal:

$ ollama help

Usage:
ollama [flags]
ollama [command]

Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
pull Pull a model from a registry
push Push a model to a registry
list List models
ps List running models
cp Copy a model
rm Remove a model
help Help about any command

Flags:
-h, --help help for ollama
-v, --version Show version information


Check the version of ollama you are using with:

$ ollama -v

ollama version is 0.3.9


Downloading models is easy using ‘ollama pull’:

$ ollama pull gemma2

This downloads the model to your local computer. A model is also pulled automatically the first time you run it:

$ ollama run phi3

This command pulls the ‘phi3’ model down to your computer and then starts it. After I had tried a few models, I wanted to get a sense of how much space they were taking up on my system, so I used the ‘list’ (or ‘ls’) subcommand of the ‘ollama’ command.

$ ollama list
NAME            	ID          	SIZE  	MODIFIED     
llama3.1:latest 	f66fc8dc39ea	4.7 GB	2 hours ago 	
codegemma:latest	0c96700aaada	5.0 GB	43 hours ago	
phi3:latest     	4f2222927938	2.2 GB	47 hours ago	
gemma2:latest   	ff02c3702f32	5.4 GB	3 days ago  	
phi3:medium     	cf611a26b048	7.9 GB	3 days ago  	
gemma2:2b       	8ccf136fdd52	1.6 GB	3 days ago  
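Because ‘ollama list’ prints plain columns, it is easy to script against. A small sketch that totals the SIZE column (it assumes every size is reported in GB, as in the listing above):

```shell
# Sum the SIZE column of 'ollama list' output; NR > 1 skips the header.
# Assumes all sizes are printed in GB, as in the listing above.
ollama list | awk 'NR > 1 { sum += $3 } END { printf "total: %.1f GB\n", sum }'
```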

Now I can see how much storage the models are using. It’s easy to delete a model; I decided to remove one of the older ones first.

$ ollama rm gemma2
deleted 'gemma2'

Just to be sure, I ran ‘ollama ls’; sure enough, the model was gone.

NAME            	ID          	SIZE  	MODIFIED     
llama3.1:latest 	f66fc8dc39ea	4.7 GB	2 hours ago 	
codegemma:latest	0c96700aaada	5.0 GB	44 hours ago	
phi3:latest     	4f2222927938	2.2 GB	47 hours ago	
phi3:medium 

You can check which models are currently running with the following command:

$ ollama ps
NAME       	ID          	SIZE  	PROCESSOR	UNTIL              
phi3:latest	4f2222927938	6.0 GB	100% CPU 	4 minutes from now	

The ‘show’ command displays information about a model.

$ ollama show phi3
  Model                                 
  	arch            	phi3  	                
  	parameters      	3.8B  	                
  	quantization    	Q4_0  	                
  	context length  	131072	                
  	embedding length	3072  	                
  	                                      
  Parameters                            
  	stop	"<|end|>"      	                   
  	stop	"<|user|>"     	                   
  	stop	"<|assistant|>"	                   
  	                                      
  License                               
  	Microsoft.                          	  
  	Copyright (c) Microsoft Corporation.

Press Ctrl+C to stop a model while it is generating output, and Ctrl+D to exit the interactive session. In short, Ollama makes it simple to download, inspect, and manage large language models from the command line.
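Beyond the interactive session, the Ollama server also answers HTTP requests on localhost port 11434, which is handy for scripting. A sketch, assuming the Ollama server is running and the phi3 model has already been pulled:

```shell
# Query a model through Ollama's local REST API instead of the REPL.
# "stream": false returns one JSON object rather than a token stream.
curl -s http://localhost:11434/api/generate -d '{
  "model": "phi3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```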

Embracing the feminine higher power

In the realm of spirituality and divine presence, one often encounters discussions revolving around gender-specific depictions of deities or higher powers within various religious traditions and cultural mythologies. While many have historically leaned towards masculine representations—often associated with strength, authority, and protection—the conceptualization of a feminine higher power brings to light the rich tapestry of female energy that permeates existence.

A woman as an embodiment or metaphor for divine love is not merely confined within ancient scriptures; contemporary spirituality also recognizes her influence in shaping our connections with ourselves, others, and ultimately all creation through this feminine higher power conceptualization. This essay delves into the characteristics of such a figure—one who might be envisioned as an ethereal being interwoven within nature’s own rhythm while guiding humanity towards harmonious coexistence with her nurturing, compassionate presence and wisdom that echoes through ages.

Firstly, the feminine higher power embodies a sense of balance—the Yin to masculine Yang in many spiritual frameworks—by offering healing energy during times when society has been imbalanced by war or conflict, poverty or social inequality, environmental degradation or collective despair. She is not merely an observer but rather actively engages with the Earth’s plight and seeks ways to mend its wounds through advocacy for ecological preservation, sustainability practices, and reverence towards natural resources that all too often are exploited without restraint or gratitude.

Moreover, a feminine higher power exudes warmth and compassion—a beacon of light amidst the darkness within ourselves as well as in others who may suffer from isolation, trauma, mental illnesses, substance abuse issues among other things; all these struggles are not only acknowledged but addressed with genuine empathy. The nurturing essence is further expressed through fostering interconnected relationships amongst humans and the environment alike where every living being feels heard, seen, loved—a testament to her boundless love for creation in its entirety without conditions or boundaries attached thereto by societal constructs of gender roles; this inclusive approach invites everyone into experiencing wholeness under one roof.

Additionally, the feminine higher power embodies intuition and wisdom beyond rational thought processes—an innate sense that transcends conventional knowledge acquired through academics or professional careers alone which offers deeper insights towards understanding complex issues facing humanity such as climate change crises affecting indigenous populations worldwide who suffer from its consequences most severely. She holds an unconditional love for all living beings, including non-human entities like animals and plants whose existence is often disregarded by humans in their quest to dominate nature rather than coexist with it harmoniously; this divine feminine presence advocates respecting the interconnectedness of life where every being plays a vital role within an intricate web that forms our beautiful planet Earth.

Furthermore, she symbolizes creativity and artistic expression—a powerful tool used to heal wounds caused by societal oppressions rooted in gender-based discrimination; the embodiment of this force manifests itself through various mediums such as dance forms like Sufi whirling or Shakti Tandava Nritya which transcend language barriers and bridge cultural divides while also offering therapeutic benefits for mental health. Her creativity lies not only within art but extends to innovative solutions towards solving global problems that require collaborations between diverse communities—where ideas flow freely across cultures, religions or social backgrounds without judgment-driven barriers inhibiting mutual understanding and respect amongst humanity at large; this is an embodiment of her visionary leadership style where she guides societies towards progressive change through dialogue rather than imposing authority.

In conclusion, a feminine higher power represents not only love but also patience—the willingness to nurture growth within ourselves and others despite external circumstances that may present obstacles along the way; this divine presence serves as an anchor for those seeking solace amid chaos or guidance through life’s complexities with graceful resilience.

Embracing her feminine essence can inspire individuals across genders to embody compassion, creativity and intuition while promoting balance within personal relationships—as well as fostering unity amongst global communities that transcend traditional barriers imposed by societal norms which have historically led towards division rather than collective growth; this invites us all into embracing a more inclusive worldview where every voice is heard, valued and respected regardless of gender or cultural background.

In essence, the conceptualization of feminine higher power serves as an empowering tool for spiritual seekers yearning to embrace love’s transformative energy within themselves while extending it towards others—an everlasting reminder that true strength lies not in dominance but in interdependence rooted in deeply embedded values such as compassion, creativity and intuition which nurture the Earth itself back into equilibrium; this divine feminine presence calls upon humanity to evolve beyond our flawed perceptions of gender roles while uniting us towards global harmony that honors every living being within creation’s vast tapestry.

From a writing prompt: “Write an essay about what a feminine higher power might be like.” (Ollama with the Phi3 model)

Assault Weapons: Reflections of Societal Illness and the Urgent Need for Comprehensive Reform in America’s Gun Culture

Assault weapon ownership, often misinterpreted as an expression of patriotic zeal or individual right within the United States, may in fact mirror symptoms and patterns indicative of a broader societal condition –that of psychological distress stemming from unchecked aggression. At face value, these formidable weapons are legal under current regulations; however, they represent more than mere tools for self-defense or hunting but symbolize an eroding demarcation between firearm and mentality as conduits to violence when misused by individuals with antisocial traits potentially indicative of underlying mental health conditions.

The prevalence of assault weapons among mass shooters hints at a disturbing correlation, not necessarily causal, but significant enough that it invites further scrutiny into the psychological makeup and intentions behind such acts – an exploration that could be beneficial in averting future tragedies. As we grapple with mental health crises affecting various demographics within society, including a disproportionately higher prevalence of diagnosed conditions among gun owners compared to non-owners, the link between weapon ownership and aggression becomes even more pertinent for discussion amongst psychologists.

Furthermore, it is worth considering whether allowing accessibility to such potent armaments inadvertently encourages a culture that normalizes or romanticizes violence as an acceptable form of expression – traits that may align with impulsive and aggressive behavior often associated with psychological profiles. This could potentially foster resentment among those who feel marginalized, further exacerbating societal rifts already present in the American cultural fabric.

The potential for these devices to be wielded as tools of terror or intimidation cannot be overlooked; their very existence within civilian hands poses an unprecedented risk that society must address holistically – incorporating insights from mental health professionals and sociologists alike, with a shared vision towards fostering empathy, understanding, healing wounds of disparities that feed resentment in the heartland. In essence, assault weapon ownership may be symptomatic, not just of individual distress but reflective of broader social pathologies calling for urgent and comprehensive societal introspection – recognizing mental health as integral to our national psyche’s well-being demands more than legislation; it calls upon a collective moral fortitude.

This essay was written from a prompt by Ollama using the Phi3 model.

Breaking Free from the Cloud: Exploring the Benefits of Local, Open-Source AI with Ollama

Everywhere you look, someone is talking or writing about artificial intelligence. I have been keenly interested in the topic since my graduate school days in the 1990s. I have used ChatGPT, Microsoft Copilot, Claude, Stable Diffusion, and other AI software to experiment with how this technology works and satisfy my innate curiosity. Recently, I discovered Ollama, an open-source tool for running large language models, such as Meta’s Llama, locally on Linux, macOS, and Microsoft Windows. There is a great deal of concern that while using LLMs in the cloud, your data is being scraped and reused by one of the major technology companies. Ollama is open-source under an MIT license, and since it runs locally, there is no danger that your work could end up in someone else’s LLM.

The Ollama website proclaims, “Get up and running with Large Language Models.” That invitation was all I needed to get started. Open a terminal on Linux and enter the following to install Ollama:

curl -fsSL https://ollama.com/install.sh | sh

The project lists all the models that you can use, and I chose the first one in the list, Llama3.1. Installation is easy, and it did not take long to install the Llama3.1 model. I followed the instructions and, in the terminal, entered the following command:

$ ollama run llama3.1

The model began to install, which took a couple of minutes. This could vary depending on your CPU and internet connection. I have an Intel i7 with 64 GB RAM and a robust internet connection. Once the model was downloaded, I was prompted to ‘talk’ with the LLM. I decided to ask a question about the history of my alma mater, St. Bonaventure University. I entered the following commands:

$ ollama run llama3.1
>>>What is the history of St. Bonaventure University?

The results were good but somewhat inaccurate. “St. Bonaventure University is a private Franciscan university located in Olean, New York. The institution was founded by the Diocese of Buffalo and has a rich history dating back to 1856.” St. Bonaventure is located near Olean, New York, and it is in the Diocese of Buffalo, but it was founded in 1858. I asked the model to name some famous St. Bonaventure alumni, and the inaccuracies were comical: Bob Lanier was a famous alumnus, but Danny Ainge was not.

The results are rendered in Markdown, which is a real plus. I also knew that having a GPU would render the results much more quickly, so I soon installed Ollama on my M2 MacBook Air. I followed the much easier directions: download the Ollama-darwin.zip, unzip the archive, and double-click the Ollama icon. The program is installed in the MacBook’s Applications folder. When the program is launched, it directs me to the Mac Terminal app, where I can enter the same commands I had entered on my Linux computer.

Unsurprisingly, Ollama uses a great deal of processing power, which is lessened if you run it on a computer with a supported GPU. My Intel NUC 11 is a powerful desktop computer with a quad-core 11th Gen Intel Core i7-1165G7, 64 gigabytes of RAM, and a robust internet connection for downloading additional models. I posed similar questions to the Llama3.1 model, first on the Intel NUC running Linux and then on the M2 MacBook Air running macOS. You can see the CPU utilization below on my Linux desktop. It’s pegged, and the model’s output is slow, at roughly 50 words per minute. Contrast that with the M2 MacBook Air, which has a GPU: its CPU utilization sat at approximately 6.9%, and the words came faster than I could read.

Screen picture by Don Watkins CC by SA 4.0

While Ollama with Llama3.1 might not excel at history recall, it does very well when asked to write Python code. I entered a prompt to create Python code that draws a circle, without specifying how to accomplish the task. It rendered the code shown below. I had to install the ‘pygame’ module, which was not on my system.

$  sudo apt install python3-pygame
# Python Game Development

import pygame
from pygame.locals import *

# Initialize the pygame modules
pygame.init()

# Create a 640x480 size screen surface
screen = pygame.display.set_mode((640, 480))

# Define some colors for easy reference
WHITE = (255, 255, 255)
RED = (255, 0, 0)

while True:
    # Handle events
    for event in pygame.event.get():
        if event.type == QUIT or (event.type == KEYDOWN and event.key == K_ESCAPE):
            pygame.quit()
            quit()

    screen.fill(WHITE)  # Fill the background with white color

    # Drawing a circle on the screen at position (250, 200), radius 100
    pygame.draw.circle(screen, RED, (250, 200), 100)
    
    # Update the full display Surface to the screen
    pygame.display.flip()

I copied the code into VSCodium and ran it. You can see the results below.

Screen picture by Don Watkins CC by SA 4.0

As I continue experimenting with Ollama and other open-source LLMs, I’m struck by the significance of this shift toward local, user-controlled AI. No longer are we forced to rely on cloud-based services that may collect our data without our knowledge or consent. With Ollama and similar projects, individuals can harness the power of language models while maintaining complete ownership over their work and personal information. This newfound autonomy is a crucial step forward for AI development and I’m eager to see where it takes us.

Where politicians fear to tread…

A nation weeps, a mother cries,
Tiny hands clutch at crimson skies.
“Just keep them safe,” they whisper low,
But hollow words, the truth don’t show.

In halls of power, shadows dance,
While children tremble, afraid to glance.
Gutless men, in ivory towers,
Bow to pressure, silencing powers.

They cater to a lobby’s plea,
For profits made and safety fee.
Blind eye they turn on innocent lives,
Clandestine deals beneath the tides.

The echo of their promises rings hollow,
A symphony of fear in every poll.
While school halls become battlegrounds rife,
And innocence fades into the knife.

How can they stand and claim “common sense,”
When lives are lost in this gun-fueled trance?
Their cowardice a bitter pill to swallow,
As innocence bleeds on the playground’s hollow.

They preach of freedom, but at what cost?
While children pay for their moral frost.
Rise from slumber, break the chains of fear,
Stand for justice, let the truth be clear!

Poem from Gemma2 Large Language Model from a writing prompt. “Write a poem about the plight of gutless politicians craven to the gun lobby while children lives are endangered in America’s classrooms.”

Harnessing the Power of Local AI: Exploring Ollama and Llama 3.1 for Advanced Language Modeling

Just yesterday, I came across Ollama, an open-source project for running large language models locally. I had been helping a friend learn more about locally hosted large language models, and my internet search led me to Ollama. I installed it quickly on my Linux Mint desktop using the command supplied on the project website.

curl -fsSL https://ollama.com/install.sh | sh

It didn’t take long to install the software, and then I wanted to explore how it worked. I decided to download and install the recommended Llama3.1 model. I followed the instructions and entered the following command in the terminal:

$ ollama run llama3.1

The model began to install, which took a couple of minutes. This could vary depending on your CPU and internet connection. I have an Intel i7 with 64 GB RAM and a robust internet connection. Once the model was downloaded, I was presented with a prompt to ‘talk’ with the LLM.

>>> What is the history of St. Bonaventure University? 

“St. Bonaventure University, a private Franciscan university located in Allegany County, New York, has a rich and storied history that spans over 160 years.”

It’s not bad, but it’s not entirely accurate either. The results were rendered in Markdown, which is a real plus. I also knew that having a GPU would render the results much more quickly, so I wanted to install Ollama on my M2 MacBook Air.

I followed the much easier directions: Download the Ollama-darwin.zip, unzip the archive, and double-click on the Ollama icon. The program is installed in the MacBook’s Application folder. When the program is launched, it directs me to the Mac Terminal app, where I can enter the same commands I had entered on my Linux computer.

On the Mac, I ran the same command to install Llama3.1. I asked for the history of St. Bonaventure University in Olean, New York. The results appeared much more quickly, but their accuracy was even worse, so far off as to be comical.

“The institution was founded in 1858 by five Jesuit priests from the Buffalo Diocese as St. Bonaventure College. The college was established on a 200-acre farm in Deerpark, New York, with the goal of providing a Catholic education to young men.”

I downloaded and installed the Gemma 2 model from Google to see if it was faster and more accurate. The commands are the same whether using the Linux or Mac terminal.

$ ollama run gemma2:2b

Gemma2:2b is the smallest model, at 1.6 GB, and it downloaded and installed quickly. The results were similarly inaccurate, but they rendered much faster on both the Mac and Linux computers. Llama3.1 was only mildly accurate, and only for information from 2021 and earlier. I asked Gemma2 who was running for President of the United States, and its information was current only as of October 26, 2023.

Despite the results, I am encouraged that it is possible to run large language models on PCs.

How to Switch from RAID to AHCI in BIOS for Better SSD Performance

I’m always shopping for laptops to outfit with Linux, and one of my favorite stops is DellRefurbished.com. Over the weekend, while surfing that site, I came upon a good deal on a Dell Latitude 5410 laptop. The (new to me) machine came with an i7 processor. Using the ‘inxi’ command, it was easy to see that this laptop was a good bargain at forty-five percent off its list price of $399.

The battery is in great shape for a three-year-old laptop that weighs a little over three pounds. I’ve been using Dell computers with Linux loaded on them for over twenty years. Lately, however, these Latitude 5410s have come with Intel Rapid Storage Technology (RST) enabled for the pre-installed Windows 10 Professional operating system. Linux Mint would not install when booting from a live USB drive.

The solution that I quickly found when upgrading one of these laptops a couple of weeks ago is as follows:

When starting up the Latitude, press the F2 function key to enter the BIOS, find the SATA controls, and change the mode from RST (RAID) to AHCI. Save the BIOS settings, restart the computer, and complete the installation of Linux Mint. Be aware that after this change, a pre-installed Windows system may no longer boot unless it is also reconfigured for AHCI.
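If you want to confirm which mode the controller is in before (or after) the change, you can check the kernel messages from the Linux Mint live session. A small sketch; the exact driver names in the output vary by machine and firmware:

```shell
# Look for AHCI (what Linux installers expect) vs. Intel RST/RAID
# in the kernel log; output varies by machine and firmware.
sudo dmesg | grep -iE 'ahci|raid'
```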

For more information on this particular issue, consult the excellent documentation from Ubuntu: https://help.ubuntu.com/rst/

Sharing Tech for Good: How I Gifted My Neighbor a Refurbished Linux Mint Laptop

I volunteer with Meals on Wheels once a week, and one of the clients on my route is a man my age. We frequently spend time visiting when I bring his meal to him. Last week, when I stopped, he asked me if I could help him put his laptop computer back in working order. I told him I’d do him one better—installing Linux Mint on his ailing computer and extending its life.

After checking with him about the model of his laptop, I discovered that it was a lightly powered notebook with minimal disk storage. I had a five-year-old System76 Darter Pro just sitting around gathering dust. I decided to install Linux Mint Cinnamon 22, which is my daily driver. Though the Darter Pro is five and a half years old, it’s still got lots of life in it. It has an Intel i7 CPU with 16 gigabytes of RAM and a 250 gigabyte NVME drive, which would be a fantastic host for my client.

When I shared my decision with him, he was delighted. Though he trusted my judgment, he knew nothing about Linux Mint and wondered if he could master the learning curve. I told him that Mint Cinnamon has many similarities to the Windows 10 desktop he was used to. I searched YouTube and other sites for a good introduction to Linux Mint but could not find the kind of documentation I wanted to give my friend, so I decided to write my own, complete with screenshots of the critical first steps to get started with a Linux Mint laptop.

This is the first screen you see after you start up the laptop. His name is Tom, and I shared his password in the documentation. I knelt next to him as I helped him through these first steps.

Linux Mint Login screen

The next thing he sees is the Welcome to Linux Mint splash screen. I provide a brief explanation and then encourage him to return here later to refer to the documentation and help information.

I direct his attention to the screen’s lower right-hand corner and guide him through the connection to his wireless access point and network connection.

I explained to him that the fourth item from the left is the ‘network connection’ icon, and he needs to click on it to display the wireless networks that are available to connect with. We quickly found his local connection.

He was unsure of his wireless password, but after some reflection, and having helped others with the same internet service provider, I remembered that the provider uses the client’s mobile phone number as the default password. That worked, and we were connected. I walked him through installing the necessary updates on his new Linux laptop, which also demonstrated the security of Linux systems, as each update required his password. Linux Mint comes with Firefox by default, but I also showed him how to add the Google Chrome browser and some other applications using the Software Manager.

Eventually, I showed him how to use most of the essential tools of this new Linux Mint computer, including ‘Nemo’, the Linux Mint file manager.

After almost forty-five minutes of initial setup and instruction, he was ready to dive in and experience Linux Mint firsthand. This morning, I received an email from Tom, and it warmed my heart. He said, “I was on my new computer till 11:00 last night. I’m like a kid with a new toy. I want to thank you for the info you sent on linux.I perused the emails and quickly learned I would never be a Linux expert…Again, a thousand thanks for the laptop. I was in a bind, and you really got my feet from the fire. Sincerely, tom.”

His email made my day. This is why I love to share the Linux and open-source journey with others: we can extend the hardware’s life and demonstrate the effectiveness of the open-source approach.

We were raisins

On August 25, 1972, we picked up our new seabags, which were full of the uniforms we had received after the first two days of recruit training, and walked in company file from Camp Barry to Camp Moffit, the main area of Recruit Training Command at the Great Lakes Naval Training Center.

Most of us had arrived two days earlier, on August 23, 1972. I’ll never forget sitting on a little square in that wooden building, where I arrived via a bus from O’Hare Airport. I had that hollow feeling mixed with dread at what the future held for me. I saw this as a death sentence even though I had volunteered for the United States Naval Reserve just two months earlier. Late in the day, near dusk, we had our first Navy chow, and it was terrible. I think it was shit on a shingle or some other unpalatable meal selection. Then we were marched back to the processing center, and eventually, we got put to bed in an open bay barracks.

On the morning of the twenty-fourth of August, we got up early, probably 0700 or earlier. We stripped down to birthday suits and received our new Navy-issue underwear, blue denim trousers, and, later, long-sleeved blue shirts. We received our vaccinations, had medical and dental exams, and shipped our civilian clothes home in boxes provided by the Navy. I often think of the people who refuse vaccinations now. On that August in 1972, no one said, “Would you like a vaccination?” They said, “Next!” The guy ahead of me in the vaccine line got his arm lacerated by the vaccine gun because he flinched. By the end of the day on the twenty-fourth, we had dinner and then returned to our temporary barracks, but by this time, we were in uniform and meeting the other young men who made up Company 72-351. MMC William W. Boyd commanded our company. We were fortunate to have Chief Boyd. He took no crap, but he was very fair, and we came to love him over the next seven weeks of our journey from civilians to members of the United States Navy.

On Friday, August 25, we rose early and had breakfast. After packing our seabags, we marched as a company from the processing area of Camp Barry to Camp Moffet, home of the Recruit Training Command Great Lakes, Illinois. We had received our white hats, but we couldn’t wear them until they and all our gear were appropriately stenciled. We marched to our new barracks wearing our wool watch caps. We were called ‘raisins.’ There’s a pecking order in recruit training, and we were on the low end. Seven more weeks of rigorous training in marching, the Uniform Code of Military Justice (UCMJ), nuclear, biological, and chemical warfare defense, firefighting, and every other aspect of life in the United States Navy would prepare us to serve the fleet’s needs.

We came to our new home in the new barracks, which seemed like Holiday Inn after our first two days of processing. We were assigned our bunks. My bunkmate was from Texas. His name was Chris Meador. One of our neighboring bunkmates was Jerry Horton, who was also from Texas. There was Tom Carlin from Philadelphia, who had graduated from Villanova University only a few months earlier. Our company comprised all United States Naval Reserve members, and we were at Great Lakes on ACDUTRA, which is “active duty for training.” Some of us were 2×6 reservists, which meant after our recruit training and ‘A’ schools, we would serve two years of active duty wherever the Navy assigned us, and then we would serve the balance of our six-year commitment as part of the active reserve attending monthly drills and then two weeks ACDUTRA in the summer. A few were 4×10 reservists who went to recruit training, ‘A’ school, and then returned to their home unit where they would serve the balance of their enlistment in the active reserve.

Every year, I remember that journey and the young men who became my shipmates. I remember Chief Boyd and his role in preparing us for naval service. The last time I saw us all together was the morning of October 13, 1972, when we were ready to march in review for our graduation from recruit training. I had been chosen to lead the battalion onto the drill field carrying the United States flag. I was the tallest and had an excellent military bearing, which was the criterion for the assignment. When I returned to the barracks after graduation to pick up my seabag and make my way across the street to the United States Navy Hospital Corps School at Great Lakes, it was empty. There was a sense of loss amid the exhilaration of completing seven weeks of training.

A couple of the guys from our recruit company joined me at Hospital Corps school. I never saw the rest again. We planned to get together one day at a bar in Manhattan called McSorley’s Old Ale House. That never happened, but talk like that united us and gave us hope for a future after recruit training. Tom Carlin and I stayed in touch by occasional mail after Great Lakes, and many years later, we reconnected via phone call thanks to LinkedIn. Many of us are grandfathers and great-uncles now. I made it back to Great Lakes and Recruit Training Command sixteen years ago as I watched my nephew graduate in mid-August of 2008. That day, I bought a Navy baseball cap at the gift shop and napped under a tree near the drill field where we marched. I remembered the young men of Company 351, and sometimes, even now, I can hear their voices and remember our time together.

Five things you can do with the nano editor

In the early stages of my experience with Linux servers, I had to learn how to edit text files using the command line. While there are other powerful text editors in Linux, such as vi and vim, I found Nano to be particularly useful. Nano is a simple yet powerful text editor that comes pre-installed on many Linux distributions. You can easily install it from the command line if it’s not pre-installed on your system.

Debian-based systems:

$ sudo apt install nano

RPM based systems:

$ sudo dnf install nano

Basic Text Editing

Nano is a user-friendly text editor designed for simple and efficient text editing. To open a file, type “nano” followed by the file name in the terminal. Once inside, you can begin typing or editing text immediately. Navigation is easy, using the arrow keys to move around. To save your changes, press Ctrl + O; to exit, press Ctrl + X.

Screen picture by Don Watkins CC by SA 4.0

Search and Replace

Nano has a valuable search and replace feature. To search for a specific term, press Ctrl + W, type your search term, and press Enter. To replace text, press Ctrl + \, enter the text you want to replace, press Enter, and then enter the replacement text. This feature is handy for quickly updating configuration files or scripts.

Undo and Redo

Mistakes happen, but Nano makes it easy to correct them with its undo and redo functionality. Press Alt + U to undo the last action and Alt + E to redo it. This feature ensures that you can quickly revert changes without losing your progress.

Syntax Highlighting

Nano offers syntax highlighting for those working with code, making reading and editing code easier. Syntax highlighting is available for various programming languages and can be enabled by adding the appropriate syntax files to your Nano configuration.

Screen picture by Don Watkins CC by SA 4.0
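On most distributions, nano ships with ready-made syntax definitions, and a one-line addition to your personal configuration enables them all. A sketch, assuming the definitions live in /usr/share/nano (the path can vary by distribution):

```shell
# Enable all of nano's bundled syntax-highlighting definitions
# (the /usr/share/nano path is an assumption; check your distribution).
echo 'include "/usr/share/nano/*.nanorc"' >> ~/.nanorc
```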

Custom Key Bindings

Nano enables you to customize key bindings to match your workflow. You can edit the /etc/nanorc file to modify default key bindings or add new ones. This flexibility allows you to personalize the editor based on your specific requirements, enhancing your editing experience and making it more efficient.
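As an illustration of what custom bindings look like, here is a sketch of lines you might add to a nanorc file (these bindings are examples, not defaults; see ‘man nanorc’ for the full list of functions and menus):

```
# Ctrl+S saves the current file without prompting for a name
bind ^S savefile main
# Ctrl+Q exits nano from any menu
bind ^Q exit all
```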

Nano’s simplicity and powerful features make it a great choice for text editing in Linux. Whether editing configuration files, writing scripts, or taking notes, Nano has the tools to do the job efficiently.