Practical Linux Commands for Real Life – Part 2


This article is part of the Linux Commands for Real Life series; you can also watch the video tutorial here.

Please follow this link for Part 1.


Using the cut command with cat

Sometimes when I am writing a script and I want to find out the name of the server, the network type, etc., I use the command below.

I use cat to print the whole file content, then grep to find the specific line, then cut to extract only the part I need.


cat /etc/sysconfig/network-scripts/ifcfg-eth0 | grep -i TYPE | cut -d "=" -f 2




grep -i TYPE -- Find the line which contains "TYPE" (case-insensitive).
cut -d "=" -f 2 -- Use "=" as the delimiter; -f 2 means show the second column.

Another example:

If you have a file whose fields are separated by "=":


Then if you cat that file through cut with -d "=" and -f 3, the output will be '4', because 4 is in the third column.
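The pipeline above can be tried safely on a throwaway file; the file contents below are a made-up example of the ifcfg-style key=value format, not a real config:

```shell
# Create a throwaway config-style file (contents are a made-up example)
printf 'DEVICE=eth0\nTYPE=Ethernet\nBOOTPROTO=dhcp\n' > /tmp/ifcfg-demo

# Same pipeline as above: grep the line, then cut the value after "="
cat /tmp/ifcfg-demo | grep -i 'TYPE' | cut -d '=' -f 2
```

This prints `Ethernet`, the second "=" separated field of the matched line.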


Change text color using tput

You can change the text output color, make it bold or italic, or change the background color using tput.


echo "$(tput smul)$(tput setaf 1)Text$(tput sgr0)":





  • $(tput smul) -- Underline
  • $(tput setaf 1) -- Set the text color to red
  • $(tput sgr0) -- Reset attributes back to normal; put this at the end, otherwise the styling will be applied to all of the following text as well.
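In scripts it is handy to store the escape sequences in variables once and reuse them; the variable names and the -Txterm terminal type below are my own choices for a self-contained sketch:

```shell
# Store tput escape sequences in variables for reuse.
# -Txterm pins a terminal type so this also works outside an interactive shell.
red=$(tput -Txterm setaf 1)
bold=$(tput -Txterm bold)
reset=$(tput -Txterm sgr0)

printf '%s%sWARNING:%s disk almost full\n' "$bold" "$red" "$reset"
```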


Sort processes by CPU usage


ps -e --sort -pcpu -o comm,pcpu,user,pid | head -11


systemd 50.0 root 1
kthreadd 44.0 root 2
ksoftirqd/0 38.0 root 3
kworker/u30:10 0.0 root 6
migration/0 8.0 root 7
rcu_bh 0.0 root 8
rcu_sched 0.0 root 9
watchdog/0 0.0 root 10
khelper 0.0 root 12
kdevtmpfs 0.0 root 13


--sort -pcpu , Sort by CPU usage, highest first (the leading '-' means descending).
-o comm,pcpu,user,pid , Show only the command name, CPU %, user, and process ID columns.
| head -11 , Show the header line plus the top 10 processes.
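The same idea works for other sort keys; for example, swapping pcpu for pmem (a standard ps column) sorts by memory usage instead:

```shell
# Top memory consumers instead of top CPU consumers
ps -e --sort -pmem -o comm,pmem,user,pid | head -11
```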


Rename multiple files with specific name with number sequence

The example below renames multiple files starting with x to data01.csv, data02.csv, etc.

ls x[a-z]* | while read file;do let c++;mv "$file" "data$(printf "%02d" "$c").csv";done


ls x[a-z]* -- List all the files whose names start with the letter 'x'.
while read file;do let c++ -- Loop over the listed files, incrementing the counter c for each one.
mv "$file" "data$(printf "%02d" "$c").csv";done -- Rename each file to 'data' followed by the zero-padded loop counter $c, with the .csv extension.


Check installed package information sorted by date

Not all installed packages' information can be retrieved using yum; if packages were installed using rpm, you can query and find the installation date using the command below:


rpm -qa --qf  '%{INSTALLTIME} (%{INSTALLTIME:date}): %{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}\n' | sort -n | tail -n5

Output sample:

1496198202(Tue 30 May 2017 10:36:42 PM EDT):mysql-community-common-5.7.18-1.el7
1496198203(Tue 30 May 2017 10:36:43 PM EDT):mysql-community-libs-5.7.18-1.el7
1496198207(Tue 30 May 2017 10:36:47 PM EDT):mysql-community-client-5.7.18-1.el7
1496198234(Tue 30 May 2017 10:37:14 PM EDT):mysql-community-server-5.7.18-1.el7
1496198235(Tue 30 May 2017 10:37:15 PM EDT):mysql-community-libs-compat-5.7.18-1.el7



rpm -qa -- List and query all installed packages.
--qf '%{INSTALLTIME}' -- Format the output; INSTALLTIME, NAME, VERSION, RELEASE, etc. are query tags that expand to the corresponding package fields.
\n' -- Print each package on its own line.
sort -n -- Sort numerically by the epoch install time.
tail -n5 -- List only the latest 5 installed packages.


Find Apache uptime or when it was restarted

Method #1

apachectl status | grep -i active


Method #2

locate error_log
cat /var/log/httpd/error_log | grep -i resum

Apache writes "resuming normal operations" to the error log every time it starts or restarts, so the matched lines show the restart times.
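To see what the grep matches without a running Apache, here is the same filter against a single sample line (the timestamp, PID, and version below are fabricated in httpd's usual startup-message format):

```shell
# A fabricated error_log line in httpd's usual startup format
echo '[Mon Jun 05 10:00:00.000000 2017] [mpm_prefork:notice] [pid 27431] AH00163: Apache/2.4.6 configured -- resuming normal operations' \
  | grep -i resum
```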


List files by last accessed ones

ls -latu

Unlike ls -lat, which sorts by modification time, the -u flag makes ls sort by last access time, so files show up even if they were only read and never modified.
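A quick way to see the difference is to set an old access time explicitly with touch -a in a throwaway directory (the file names are made up):

```shell
# Demo: give one file an old access time, then list by atime
dir=$(mktemp -d)
touch "$dir/recent"
touch -a -d '2001-01-01' "$dir/old"   # changes only the access time

ls -latu "$dir"    # 'recent' appears above 'old'
```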


Which processes are using a certain file


fuser /var/log/httpd/error_log

Output example:

/var/log/httpd/error_log: 27431 27432 27433 27434 27435 27436

Investigate a specific PID

ps aux | grep -i 27431

The output will show that httpd is running under that PID, so now we know that httpd is the process using that file.
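A more precise alternative to grepping the whole ps output is ps -p, which looks up one PID directly; since 27431 above is just an example PID, the sketch below uses $$, the PID of the current shell:

```shell
# Look up a single PID directly; -o comm= prints just the command name, no header
ps -p "$$" -o comm=
```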



