How to Update and Install Packages on a Server Without Internet Using Another Server

Hello,
I have two servers:

  • Server A: This server has internet access.
  • Server B: This server does not have internet access.

I want to use Server A as a local YUM repository so that I can update and install packages on Server B, even though it has no internet access.

You most likely need to build your own "repository mirror". Most Linux distributions have dedicated documentation on how to create a repository mirror; refer to it.


ChatGPT 4o Suggests:

You can set up Server A as a YUM repository and configure Server B to use it as its package source. Here’s how:


Step 1: Set Up a Local YUM Repository on Server A

  1. Install Required Packages
    On Server A, install the required packages to create a YUM repository:

    sudo yum install -y createrepo httpd
    
  2. Create a Directory for the Repository

    sudo mkdir -p /var/www/html/repo
    
  3. Sync Packages from the Internet (Optional)
    If you want to mirror CentOS/RHEL repositories, you can use reposync:

    sudo yum install -y yum-utils
    sudo reposync --repoid=base --download-metadata --download-path=/var/www/html/repo
    
  4. Create the Repository Metadata

    sudo createrepo /var/www/html/repo
    
  5. Set Correct Permissions

    sudo chmod -R 755 /var/www/html/repo
    sudo chown -R apache:apache /var/www/html/repo
    
  6. Start the HTTP Server

    sudo systemctl start httpd
    sudo systemctl enable httpd
    
  7. Allow HTTP Access in the Firewall

    sudo firewall-cmd --permanent --add-service=http
    sudo firewall-cmd --reload
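
Once all of the above is in place, it is worth a quick sanity check on Server A itself that Apache is actually serving the repository metadata created by createrepo (this assumes the /var/www/html document root used above):

curl -I http://localhost/repo/repodata/repomd.xml   # an HTTP 200 response means the repo is being served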
    

Step 2: Configure Server B to Use Server A’s Repository

  1. Disable Other Repositories (Optional)
    If you don’t want Server B to check external repositories, disable them:

    sudo yum-config-manager --disable "*"
    
  2. Create a New Repository Configuration File
    On Server B, create a new YUM repo file:

    sudo vi /etc/yum.repos.d/local.repo
    

    Add the following content:

    [localrepo]
    name=Local YUM Repository
    baseurl=http://<Server_A_IP>/repo
    enabled=1
    gpgcheck=0
    

    Replace <Server_A_IP> with the actual IP address of Server A.

  3. Clear and Rebuild the YUM Cache

    sudo yum clean all
    sudo yum makecache
    
  4. Test Installing a Package

    sudo yum install -y vim
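
If you skipped disabling the other repositories in step 1, one way to make sure the test really pulls from Server A is to restrict a single transaction to the local repo (localrepo is the repo ID defined in local.repo above):

sudo yum --disablerepo="*" --enablerepo="localrepo" install -y vim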
    

Optional: Automate Syncing on Server A

To keep the repository updated, set up a cron job:

echo "0 3 * * * root reposync --repoid=base --download-metadata --download-path=/var/www/html/repo && createrepo --update /var/www/html/repo" | sudo tee -a /etc/crontab

This runs the sync every day at 3 AM.
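
One caveat: the repo ID "base" assumes an older yum-based CentOS/RHEL release. If Server A runs a dnf-based release instead (RHEL 8+ or CentOS Stream, which was not stated in the question), reposync is provided by the dnf-plugins-core package and the stock repo IDs differ (for example baseos and appstream), so the sync would look roughly like this:

sudo dnf install -y dnf-plugins-core
sudo dnf reposync --repoid=baseos --download-metadata -p /var/www/html/repo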


Verification

  • From Server B, try running:

    yum repolist
    

    It should list "Local YUM Repository" as an available repo.

  • If the installation fails, check:

    sudo systemctl status httpd           # On Server A: ensure the web server is running
    sudo ls /var/www/html/repo/           # On Server A: ensure packages exist in the repository
    curl http://<Server_A_IP>/repo/       # On Server B: test that Server B can reach Server A
    

Let me know if you run into any issues!

Why not set up a proxy server, instead of a local repo?

ChatGPT 4o Replies:

Setting up a proxy server is a great alternative: Server B accesses the external YUM repositories through Server A, so there is no local repository to sync and maintain, while Server B can still install and update packages.


Setting Up a Proxy Server on Server A

Server A will act as a caching proxy for YUM requests from Server B.

Step 1: Install Squid Proxy on Server A

On Server A, install and enable Squid:

sudo yum install -y squid
sudo systemctl enable --now squid

Step 2: Configure Squid to Allow Server B

Edit the Squid configuration file:

sudo vi /etc/squid/squid.conf

Add the following lines above the final "http_access deny all" rule (Squid applies http_access rules in order, so a rule added after "deny all" would never be reached):

acl mynetwork src <Server_B_IP>/32
http_access allow mynetwork

Replace <Server_B_IP> with the actual IP of Server B.

Then, restart Squid:

sudo systemctl restart squid
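
Before touching YUM's configuration, you can check from Server B that the proxy answers at all (this assumes Squid's default port 3128 and that curl is installed on Server B; example.com is only a stand-in for any HTTP URL the proxy should be able to reach):

curl -x http://<Server_A_IP>:3128 -I http://example.com/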

Step 3: Configure YUM on Server B to Use the Proxy

Edit the YUM configuration file on Server B:

sudo vi /etc/yum.conf

Add this line in the [main] section:

proxy=http://<Server_A_IP>:3128

Replace <Server_A_IP> with the IP address of Server A.
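
The relevant part of /etc/yum.conf would then look something like the snippet below; the commented line is just a placeholder for whatever options your [main] section already contains:

[main]
# ... your existing options ...
proxy=http://<Server_A_IP>:3128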

Step 4: Test Package Installation on Server B

Run:

sudo yum clean all
sudo yum makecache
sudo yum install -y vim

If everything is set up correctly, YUM on Server B will fetch packages through Server A’s Squid proxy.
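
On Server A you can also watch Squid's access log while that test runs to confirm the requests really go through the proxy (the path below is the usual default on RHEL/CentOS builds of Squid):

sudo tail -f /var/log/squid/access.log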


Why Use a Proxy Instead of a Local Repository?

| Local YUM Repository | Proxy Server (Squid) |
| --- | --- |
| Requires downloading and storing all packages manually. | No need to manually sync or maintain a package repository. |
| Takes up disk space on Server A. | Saves bandwidth by caching frequently used packages. |
| Can work without internet if pre-synced. | Requires internet access but can cache updates. |
| Best for controlled environments with specific package versions. | Best for dynamic updates and package availability. |

If Server A has sufficient disk space and you want full offline control, go with a local YUM repo.

If you want automatic updates with caching, go with Squid Proxy.

Would you like additional configurations, such as authentication or SSL support for Squid?


IMHO don't do this (the recursive chmod/chown in step 5)!
chmod -R 755 would make all the files executable.
And Apache does not need write access, so there is no need to chown the tree to apache:apache.

And don't alter /etc/crontab.
Instead, either make a root crontab entry:

0 3 * * * reposync --repoid=base --download-metadata --download-path=/var/www/html/repo && createrepo --update /var/www/html/repo

or create a system cron file /etc/cron.d/myreposync (a crontab fragment with a user field, not a shell script):

0 3 * * * root reposync --repoid=base --download-metadata --download-path=/var/www/html/repo && createrepo --update /var/www/html/repo
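
If you go the /etc/cron.d route, make sure the file is owned by root and not writable by group or others, since some cron implementations ignore or reject files with loose permissions, for example:

sudo chown root:root /etc/cron.d/myreposync
sudo chmod 644 /etc/cron.d/myreposync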

But the repo mirror needs a lot of disk space.
Therefore, I would go for the Squid proxy.


Yes, that would be how I would set things up, personally speaking.

Thank you for answering.

I never ask artificial intelligence about this type of work; it can make a very simple job very complicated.

A newly hired team member wrote a script here and ran it without questioning it, which led to bad results.

It is more reliable to get information from experienced people.

Actually, this is wrong.

People make mistakes. AI makes mistakes.

It is better to know how to review what anyone advises you to do, whether experts, AI, or your co-workers. Everyone makes mistakes, including human "experts".

Your statement shows you generally do not understand how to use advanced AIs like ChatGPT 4o.

Refer to: