Shellman is your friendly CLI assistant: a curated toolbox for working with files, data, and your system.
git clone https://github.com/Shellman-Project/shellman.git
cd shellman
A virtual-env is recommended:
python -m venv .venv
source .venv/bin/activate # Linux / macOS
.venv\Scripts\activate # Windows
pip install -r requirements.txt
pip install .
Using Poetry instead?
poetry install
For development, use an editable install:
pip install -e .
(Any code or help-file change is picked up instantly; re-install only if you rename or move modules.)
List all commands:
shellman --help
Every command supports multi-language help via:
shellman <command> --lang-help eng # English help
shellman <command> --lang-help pl # Polish help
Example:
shellman lines README.md --count --show-size --contains shellman/com --extract --before 2 --after 2 --percent
==> README.md <==
88:
89:Pull-requests and feature ideas are welcome!
90:Each command lives in shellman/commands/ - feel free to add your own tool.
91:
92:########################################################################################
--------------------
182:
183:Masz pomysł lub poprawkę? Zgłoś pull-request lub otwórz issue.
184:Każde narzędzie znajduje się w shellman/commands/ - śmiało dodaj własne!
185:
==> README.md <==
Matching lines: 2
Match percentage: 1.08%
File size: 5.31 KB
Command short descriptions:
Command | Purpose | Quick example |
---|---|---|
excel | Excel utilities: info / preview / export | shellman excel info data.xlsx |
find_files | Locate files by name, content or extension | shellman find_files ./src --name util |
change_line_end | LF ↔ CRLF conversion / check | shellman change_line_end file.txt |
checksum_files | Create or verify checksums (sha256, md5, sha1) | shellman checksum_files ./downloads |
clean_files | Delete junk files by name, ext or age | shellman clean_files ./tmp --ext log |
count_lines | Count lines in files | shellman count_lines ./src |
extract_lines | Extract or summarize lines with filters and context | shellman extract_lines ./src --contains TODO |
csv_extract | Extract columns/rows from CSV | shellman csv_extract data.csv --cols 1,3 |
date_utils | Add/subtract dates, diff, strftime | shellman date_utils --add 5d |
encrypt_files | AES-256 encrypt / decrypt with password | shellman encrypt_files secret.txt |
file_convert | Convert between JSON, YAML, TOML | shellman file_convert config.toml json |
file_stats | Show path, size, line-count, extension | shellman file_stats ./src |
json_extract | JSON path, filter and field selection | shellman json_extract data.json --path user.name |
merge_files | Merge multiple text files into one | shellman merge_files ./logs -o merged.txt |
replace_text | Batch find-&-replace with diff preview | shellman replace_text ./docs --find old --replace new --preview |
sys_summary | System / shell / resource report | shellman sys_summary |
zip_batch | Bulk ZIP creation (per-folder or flat) | shellman zip_batch ./projects |
open_ports | Show currently open TCP/UDP ports (process, pid, address, state) | shellman open_ports |
speed_test | Run a quick internet speed test (download, upload, ping) | shellman speed_test |
dir_tree | Prints a visual tree of directories (like tree) | shellman dir_tree ./src -f -d 2 |
Plus a few helpers: checksum_files, date_utils, etc. Run --help on any sub-command for full docs.
Convert or check LF/CRLF line endings in files or folders.
This command allows you to:
- Convert line endings in files or entire directories between LF (Unix, \n) and CRLF (Windows, \r\n) formats.
- Check the line ending type of files and quickly detect if files contain mixed line endings.
This is especially useful when working on projects across different operating systems or collaborating with others, to ensure consistency and avoid hidden issues.
shellman change_line_end [OPTIONS]
Options
--file PATH
Path to a single file to process.
--dir PATH
Path to a directory to process all files inside recursively.
--ext EXT
Only process files with this extension (e.g., --ext py). Requires --dir.
--to {lf,crlf}
Convert all matching files to the chosen line ending type.
--check
Only check and report the line ending type per file, do not modify.
--lang-help {pl,eng}
Show this help in Polish or English instead of executing the command.
- Check line endings of a file
shellman change_line_end --file script.py --check
Output:
script.py → CRLF
- Convert a single file to LF (Unix)
shellman change_line_end --file README.md --to lf
Output:
✔ converted to LF: README.md
- Recursively convert all .txt files in a directory to CRLF (Windows)
shellman change_line_end --dir docs --ext txt --to crlf
- Check all .sh scripts in a folder for line ending types
shellman change_line_end --dir scripts --ext sh --check
Possible outputs:
scripts/install.sh → LF
scripts/legacy_convert.sh → MIXED
You must specify either --file or --dir.
Either --to or --check is required.
Mixed line endings (MIXED) indicate a file contains both LF and CRLF; it's best to normalize these.
The extension filter (--ext) should be used with --dir.
The command works on all text files, but binary files may yield incorrect results.
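For context, line-ending detection boils down to counting CRLF pairs versus bare LF bytes. A minimal Python sketch of that idea (not Shellman's actual implementation; detect_line_endings is a hypothetical helper):

```python
# Hypothetical sketch of how --check could classify a file as LF / CRLF / MIXED.
from pathlib import Path

def detect_line_endings(path: Path) -> str:
    data = path.read_bytes()
    crlf = data.count(b"\r\n")
    lf = data.count(b"\n") - crlf  # bare LF only
    if crlf and lf:
        return "MIXED"
    if crlf:
        return "CRLF"
    if lf:
        return "LF"
    return "NONE"

print(detect_line_endings(Path("script.py")))
```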
Generate or verify file checksums (SHA256, MD5, SHA1) in directories and projects.
This command allows you to:
- Generate checksums for all files in a directory (with optional extension filter), saving the list to a file.
- Verify the integrity of files based on a previously generated checksum list (detects tampering or corruption).
- Choose popular algorithms: SHA256 (default), MD5, or SHA1.
Checksums are essential for verifying data integrity after transfers, backups, or before deployment.
shellman checksum_files [OPTIONS]
Options
--path PATH
Directory to scan (default: current directory .).
--ext EXT
Only include files with this extension (e.g., --ext py).
--algo {sha256,md5,sha1}
Select hash algorithm (default: sha256).
--out FILE
Name of the output file for checksum list (default: checksums.sum).
--verify
Verify files using an existing checksum list file (reads from --out).
--lang-help {pl,eng}
Show this help in Polish or English.
- Generate SHA256 checksums for all files in the current directory
shellman checksum_files --algo sha256
Creates checksums.sha256sum with contents like:
d2f0b8... ./script.py
342ab9... ./data/config.json
- Generate checksums for all .txt files in docs directory using MD5
shellman checksum_files --path docs --ext txt --algo md5 --out docs.md5list
- Verify files using a previously generated checksum list
shellman checksum_files --verify --out docs.md5list
Output:
🔍 Verifying files via md5 list docs.md5list ...
✅ OK: docs/notes.txt
❌ MISMATCH: docs/draft.txt
⚠️ docs/old.txt not found
The extension filter (--ext) works only with the --path directory option.
The default output file is checksums.sum, e.g., checksums.sha256sum.
When verifying, all files and their hashes are checked against the list; any mismatch or missing file is reported.
The command works on all file types, but binary and large files may take longer to process.
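The underlying idea is the classic "hash each file, store one hash-plus-path line, recompute on verify" pattern. A rough hashlib sketch of both steps, not Shellman's own code:

```python
# Sketch of the generate/verify idea behind checksum lists (paths are placeholders).
import hashlib
from pathlib import Path

def file_hash(path: Path, algo: str = "sha256") -> str:
    h = hashlib.new(algo)
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Generate: one "<hash>  <path>" line per file, like the example output above.
root = Path(".")
entries = [f"{file_hash(p)}  {p}" for p in root.rglob("*") if p.is_file()]
Path("checksums.sum").write_text("\n".join(entries) + "\n")

# Verify: recompute each hash and compare it against the stored list.
for line in Path("checksums.sum").read_text().splitlines():
    expected, name = line.split(maxsplit=1)
    target = Path(name)
    ok = target.is_file() and file_hash(target) == expected
    print("OK" if ok else "MISMATCH/MISSING", name)
```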
Delete files by name, extension, or age, with preview and confirmation.
This command lets you clean up your project or directory by deleting files:
- Matching a name pattern (substring in filename)
- With a specific extension (e.g., log, tmp)
- Older than a chosen number of days
- Or any combination of the above
Preview and confirmation options make clean-up safe and transparent.
Use it to remove logs, backups, temporary files, build artifacts, or any clutter!
shellman clean_files [OPTIONS]
Options
--path PATH
Directory to scan (default: current directory .)
--ext EXT
Delete files with this extension (e.g., --ext log)
--name PATTERN
Delete files whose name contains this pattern (substring)
--older-than DAYS
Delete only files older than N days
--dry-run
Preview: list files that would be deleted, but do not delete them
--confirm
Ask before deleting each file (Y/n prompt for each)
--lang-help {pl,eng}
Show this help in Polish or English
- Preview all .log files to be deleted in current directory
shellman clean_files --ext log --dry-run
- Delete .tmp files older than 30 days (with confirmation prompt)
shellman clean_files --ext tmp --older-than 30 --confirm
- Remove all files containing backup in their name (no confirmation)
shellman clean_files --name backup
- Delete .bak files in /var/data and preview before deleting
shellman clean_files --path /var/data --ext bak --dry-run
- Remove .old files containing report in the name, older than 60 days
shellman clean_files --ext old --name report --older-than 60 --dry-run
At least one of --ext or --name is required.
--older-than is optional; applies additional filtering by modification time.
Use --dry-run first to preview which files would be deleted.
Use --confirm to prompt for each file before deletion (safer).
Matching is case-sensitive.
Deletion is permanent: files are not moved to the trash/recycle bin!
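The --older-than filter is essentially a comparison of each file's modification time against a cutoff. A small illustrative sketch (the path, extension, and 30-day threshold are placeholders):

```python
# Sketch of an age-based filter like --older-than, in dry-run style.
import time
from pathlib import Path

def older_than(path: Path, days: int) -> bool:
    cutoff = time.time() - days * 86400
    return path.stat().st_mtime < cutoff

candidates = [p for p in Path("./tmp").rglob("*.log") if p.is_file() and older_than(p, 30)]
for p in candidates:
    print("would delete:", p)  # preview only; p.unlink() would actually delete
```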
Extract specific columns or rows from a CSV file, with powerful filtering and export options.
This command helps you quickly extract and filter data from CSV files, such as:
- Selecting specific columns (by number)
- Extracting a range of rows
- Keeping or excluding rows based on text filters
- Skipping the header row if needed
- Saving output to a file or viewing interactively
Works with all CSV-like files (you can set the delimiter).
shellman csv_extract FILE --cols COLUMNS [OPTIONS]
Arguments:
FILE β path to the CSV file to process (required)
Options:
--cols COLS
Required. Columns to keep (1-based), e.g. 1,3 or 2-4.
--rows ROWS
Only keep selected rows (after header). Format: 2-10 or 1,3,5.
--contains TEXT
Only keep rows containing this text (case-insensitive).
--not-contains TEXT
Only keep rows NOT containing this text.
--delim CHAR
CSV delimiter (default: ,).
--skip-header
Skip the first line (header row).
--output FILE
Save the result to a file instead of printing.
--interactive
Pipe the output to a pager (e.g., less).
--lang-help {pl,eng}
Show this help in Polish or English.
- Extract columns 2 and 4 from a CSV
shellman csv_extract data.csv --cols 2,4
- Extract columns 1-3 and only rows 2 to 10 (excluding the header)
shellman csv_extract data.csv --cols 1-3 --rows 2-10 --skip-header
- Export rows containing "error" in any column
shellman csv_extract data.csv --cols 1,2 --contains error
- Export rows not containing "backup", save to file
shellman csv_extract data.csv --cols 1,3 --not-contains backup --output filtered.csv
- Preview large output interactively
shellman csv_extract data.csv --cols 1,2,3 --interactive
Both --cols and file argument are required.
Columns and rows are 1-based (first column = 1).
You can combine column and row selection, with or without filtering.
Use either --contains or --not-contains (not both at once).
For custom delimiters (e.g., ;), use --delim ";".
--skip-header only skips the first line (useful if your CSV has a header).
Output is printed to the terminal unless --output is given.
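A column spec such as 1,3 or 2-4 can be expanded into a list of 1-based indexes and applied row by row with the standard csv module. A hypothetical sketch of that approach (parse_cols is not part of Shellman's API):

```python
# Sketch: expand a 1-based column spec and print the selected columns of each row.
import csv

def parse_cols(spec: str) -> list[int]:
    cols: list[int] = []
    for part in spec.split(","):
        if "-" in part:
            start, end = (int(x) for x in part.split("-"))
            cols.extend(range(start, end + 1))
        else:
            cols.append(int(part))
    return cols

cols = parse_cols("1,3")
with open("data.csv", newline="", encoding="utf-8") as fh:
    for row in csv.reader(fh, delimiter=","):
        print(",".join(row[c - 1] for c in cols if c - 1 < len(row)))
```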
Work with dates: add/subtract time, compare or format; a flexible CLI date/time helper.
This command allows you to:
- Add or subtract time from a given date (or now)
- Compare two dates (shows difference in days, hours, minutes, seconds)
- Format dates using any strftime pattern
- Parse dates in common formats: YYYY-MM-DD or YYYY-MM-DD HH:MM:SS
Great for scripting, reporting, or quick date math in the shell.
shellman date_utils [OPTIONS]
Options
--date DATE
Base date (default: now). Accepted formats: 2024-07-08 or 2024-07-08 15:30:00.
--add DURATION
Add time to base date. Format: e.g. 5d, 3w, 2h, 10min, 1m, 1y, 2q, 45s. Supported units:
d = days
w = weeks
m = months
y = years
q = quarters (3 months)
h = hours
min = minutes
s = seconds
--sub DURATION
Subtract time from base date (same formats/units as above).
--diff DATE
Show difference between base date and this date (in days, hours, minutes, seconds).
--format PATTERN
Format the date/result using any strftime pattern, e.g. %Y-%m-%d, %d/%m/%Y, %A, %H:%M.
--lang-help {pl,eng}
Show this help in Polish or English.
- Add 3 days to now
shellman date_utils --add 3d
- Subtract 2 months from a specific date
shellman date_utils --date 2023-10-01 --sub 2m
- Show difference between two dates
shellman date_utils --date 2024-01-01 --diff 2024-07-08
- Format current date in custom format
shellman date_utils --format "%A, %d-%m-%Y %H:%M"
- Chain calculations and formatting
shellman date_utils --date 2024-01-01 --add 1y --format "%Y/%m/%d"
Date input must be YYYY-MM-DD or YYYY-MM-DD HH:MM:SS
Only one --add or --sub operation at a time
Units are case-insensitive, e.g. 10MIN is valid
Use strftime for custom output (see docs for all codes)
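Duration strings like 5d, 2m, or 10min map naturally onto calendar-aware offsets. A sketch of one way to parse them, assuming the third-party python-dateutil package is available (parse_duration is a hypothetical helper, not Shellman's code):

```python
# Sketch: parse "5d"/"2m"/"10min"-style durations into calendar-aware offsets.
import re
from datetime import datetime
from dateutil.relativedelta import relativedelta  # assumes python-dateutil

UNIT_TO_KW = {"d": "days", "w": "weeks", "m": "months", "y": "years",
              "h": "hours", "min": "minutes", "s": "seconds", "q": "months"}

def parse_duration(text: str) -> relativedelta:
    m = re.fullmatch(r"(\d+)\s*([a-zA-Z]+)", text.strip())
    if not m:
        raise ValueError(f"bad duration: {text}")
    amount, unit = int(m.group(1)), m.group(2).lower()
    if unit == "q":            # quarters = 3 months
        amount *= 3
    return relativedelta(**{UNIT_TO_KW[unit]: amount})

print(datetime(2024, 1, 1) + parse_duration("1y"))   # 2025-01-01 00:00:00
```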
Prints a visual tree of directories (like the Unix tree command), with options for files, depth, ASCII, and more.
This command displays a structured tree view of your directory and subdirectories:
- See the full hierarchy at a glance
- Optionally include files, not just folders
- Limit the depth of recursion for huge folders
- Save the tree as text to a file
- Show hidden files/folders or stick to visible ones
- Choose between Unicode and pure ASCII line graphics
Ideal for exploring codebases, reporting directory structure, or documenting projects.
shellman dir_tree [PATH] [OPTIONS]
Arguments:
PATH β directory to scan (default: current directory .)
Options:
--files
Include files (not just directories) in the tree.
--depth N
Limit recursion to N levels deep.
--output FILE
Save the tree output to a text file.
--hidden
Include hidden files and folders (names starting with .).
--ascii
Use ASCII-only box drawing (for old terminals or code comments).
--exclude
Exclude patterns (folder names, file names, or extensions, e.g. __pycache__, *.txt, *.pyc)
--lang-help {pl,eng}
Show this help in Polish or English.
- Print the full directory tree (folders only)
shellman dir_tree
- Show directories and files up to 2 levels deep
shellman dir_tree --files --depth 2
- Include hidden files and folders, ASCII lines
shellman dir_tree --files --hidden --ascii
- Save tree view of /home/user/project to tree.txt
shellman dir_tree /home/user/project --output tree.txt
By default, only folders are shown; add --files for full content.
Maximum depth is unlimited unless set by --depth.
Some systems may have permission errors on protected folders; these will be marked.
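A depth-limited tree view is a straightforward recursion over directory entries with branch prefixes. An illustrative sketch, not the command's actual implementation:

```python
# Sketch of a depth-limited tree printer (Unicode branches, folders before files).
from pathlib import Path

def print_tree(root: Path, depth: int | None = None, prefix: str = "", level: int = 0) -> None:
    if depth is not None and level >= depth:
        return
    try:
        entries = sorted(root.iterdir(), key=lambda p: (p.is_file(), p.name.lower()))
    except PermissionError:
        print(prefix + "└── [permission denied]")
        return
    for i, entry in enumerate(entries):
        last = i == len(entries) - 1
        print(prefix + ("└── " if last else "├── ") + entry.name)
        if entry.is_dir():
            print_tree(entry, depth, prefix + ("    " if last else "│   "), level + 1)

print_tree(Path("./src"), depth=2)
```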
Encrypt or decrypt files in a folder using strong AES-256 encryption with a password.
This command allows you to:
- Encrypt all matching files in a directory (optionally filter by extension) using AES-256 and a password.
- Decrypt .enc files with the correct password.
- Output files are saved in separate directories (encrypted/ or decrypted/ by default, or a custom directory set with --out).
The encryption uses AES-256 in CBC mode, with a randomly generated salt and IV per file. Key is derived using PBKDF2-HMAC-SHA256 with 100,000 iterations.
Ideal for backups, archiving, sending files securely, or protecting secrets in projects.
shellman encrypt_files --mode {encrypt|decrypt} --password PASS [OPTIONS]
Options
--mode {encrypt, decrypt}
Required. Choose encryption or decryption mode.
--password PASSWORD
Required. Password to encrypt/decrypt files (do not lose this!).
--ext EXT
Only process files with this extension (e.g. --ext txt).
--path DIR
Directory to scan (default: current directory).
--out DIR
Output directory (default: encrypted/ or decrypted/).
--lang-help {pl,eng}
Show this help in Polish or English.
- Encrypt all files in current directory
shellman encrypt_files --mode encrypt --password mySecret
- Encrypt only .pdf files in docs/, save to my_encrypted/
shellman encrypt_files --mode encrypt --password MyPass123 --path docs --ext pdf --out my_encrypted
- Decrypt all .enc files in backup/ using a password
shellman encrypt_files --mode decrypt --password mySecret --path backup
The password is never stored; losing it means files are unrecoverable.
Decrypted files are restored without the .enc extension.
You must specify --mode and --password for all operations.
Key derivation uses 100,000 PBKDF2 iterations for good security.
Each file is encrypted separately, with a unique salt and IV.
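As a reference for the scheme described above (PBKDF2-HMAC-SHA256 with 100,000 iterations, AES-256 in CBC mode, fresh salt and IV per file), here is a minimal sketch using the cryptography package. The on-disk layout shown (salt + IV + ciphertext) is illustrative only and not necessarily Shellman's exact format:

```python
# Sketch of PBKDF2-derived AES-256-CBC encryption of one buffer (assumes "cryptography").
import os
from cryptography.hazmat.primitives import hashes, padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_bytes(data: bytes, password: str) -> bytes:
    salt, iv = os.urandom(16), os.urandom(16)                 # unique per file
    key = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=100_000).derive(password.encode())
    padder = padding.PKCS7(128).padder()
    padded = padder.update(data) + padder.finalize()
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return salt + iv + enc.update(padded) + enc.finalize()    # layout is illustrative only

print(len(encrypt_bytes(b"hello", "mySecret")))
```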
Utilities for working with Excel files: quick structure checks, table previews, and CSV export.
- excel info: list sheets with row/column counts
- excel preview: preview the first N rows / selected columns
- excel export: export sheets or ranges to CSV
Quick examples:
shellman excel info report.xlsx
shellman excel preview data.xlsx --sheet 2 --rows 10 --columns A,C-E -i
shellman excel export workbook.xlsx -s 1 -s 3 -o out --overwrite
List all sheets in an .xlsx file with the number of rows and columns.
shellman excel info FILE.xlsx [--lang-help pl|eng]
Example output:
Sheet Rows Cols
----------------------------------
Sheet1 125 8
Summary 12 4
Options:
--lang-help pl|eng - show localized help text instead of executing
Preview the first N rows of a sheet (by index or name) with optional column selection. Can render as a neatly aligned table (row numbers, | separators) or raw CSV. Supports saving to a file or paging interactively.
shellman excel preview FILE.xlsx \
[--sheet NAME_OR_NUM] [--rows N] [--columns SPEC] \
[--output FILE] [--interactive] [--info|-i] [--lang-help pl|eng]
Options
--sheet TEXT - sheet name or 1-based index (default: 1)
--rows INT - number of rows to preview (default: 20)
--columns TEXT - column spec like A,C-E
--output PATH - save result to a file (text)
--interactive - pipe to pager (less -S)
--info, -i - show column headers, row numbers, and | separator
--lang-help pl|eng - localized help
Examples:
# 10 rows from sheet #2, columns A and C-E, pretty table:
shellman excel preview data.xlsx --sheet 2 --rows 10 --columns A,C-E -i
# Save first 50 rows of sheet "Report" to out.csv:
shellman excel preview data.xlsx --sheet Report --rows 50 --output out.csv
Column spec supports single letters and ranges, e.g., A,B-D.
With -i, output is width-aligned and row-numbered.
Export one or more sheets (optionally narrowed by row/column ranges) to CSV files. Each sheet becomes a separate CSV. Without --overwrite, a timestamp is appended to filenames.
shellman excel export FILE.xlsx \
[--sheets|-s NAME_OR_INDEX ...] [--rows|-r START-END] [--columns|-c SPEC] \
[--out|-o DIR] [--overwrite|-ow] [--lang-help|-lh pl|eng]
Options
--sheets, -s TEXT (repeatable) - sheet names or indexes to export; omit to export all
--rows, -r TEXT - row range start-end (e.g., 2-100)
--columns, -c TEXT - columns like A,B-D
--out, -o PATH - output directory (default: csv)
--overwrite, -ow - overwrite (no timestamp in filenames)
--lang-help, -lh pl|eng - localized help
Examples:
# Export all sheets to csv/:
shellman excel export data.xlsx
# Export sheet "Data", rows 2-50, columns A-E, without timestamp:
shellman excel export data.xlsx -s Data -r 2-50 -c A-E -o csv --overwrite
# Export sheets 1 and 3 to out/:
shellman excel export workbook.xlsx -s 1 -s 3 -o out
Notes
Supported input: .xlsx
One CSV per sheet
Without --overwrite, a timestamp is appended to avoid collisions
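A column spec such as A,C-E can be expanded into numeric indexes with openpyxl's utilities and used to slice previewed rows. A hypothetical sketch assuming openpyxl is installed and data.xlsx has at least two sheets:

```python
# Sketch: expand an "A,C-E" column spec and preview the first rows of sheet #2.
from openpyxl import load_workbook
from openpyxl.utils import column_index_from_string

def parse_columns(spec: str) -> list[int]:
    cols: list[int] = []
    for part in spec.split(","):
        if "-" in part:
            start, end = (column_index_from_string(x) for x in part.split("-"))
            cols.extend(range(start, end + 1))
        else:
            cols.append(column_index_from_string(part))
    return cols

wb = load_workbook("data.xlsx", read_only=True)
ws = wb.worksheets[1]                      # second sheet (list is 0-based here)
cols = parse_columns("A,C-E")
for row in ws.iter_rows(min_row=1, max_row=10, values_only=True):
    print([row[c - 1] for c in cols if c - 1 < len(row)])
```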
Convert between JSON, YAML, and TOML formats in a single step.
Convert data files between popular formats:
- JSON ↔ YAML
- JSON ↔ TOML
- YAML ↔ TOML
You can pretty-print results, output to a file, or page interactively.
Useful for config migrations, converting data for different tools, or normalizing format style.
shellman file_convert FILE --from FORMAT --to FORMAT [OPTIONS]
Arguments & Options
FILE
Input file path (required)
--from FORMAT
Required. Input format: json, yaml, or toml
--to FORMAT
Required. Output format: json, yaml, or toml
--output FILE
Save converted data to a file
--pretty
Pretty-print output (only for JSON/YAML)
--interactive
Pipe output to a pager (e.g., less)
--lang-help {pl,eng}
Show this help in Polish or English
- Convert YAML to JSON, print result
shellman file_convert config.yaml --from yaml --to json --pretty
- Convert TOML to YAML and save to file
shellman file_convert pyproject.toml --from toml --to yaml --output config.yaml
- Convert JSON to TOML, view output interactively
shellman file_convert settings.json --from json --to toml --interactive
You must provide both --from and --to formats.
Only valid UTF-8 text files are supported.
Pretty printing is available for JSON and YAML only.
The tool will try to preserve structure and comments (where format allows).
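Conceptually the conversion is just "load with one parser, dump with another." A sketch of that round-trip, assuming PyYAML and tomli-w are installed and Python 3.11+ for the standard-library tomllib reader (load/dump here are hypothetical helpers):

```python
# Sketch of a JSON/YAML/TOML round-trip (tomllib reads TOML; tomli_w writes it).
import json
import tomllib
import yaml       # PyYAML (assumed installed)
import tomli_w    # TOML writer (assumed installed)

def load(path: str, fmt: str):
    if fmt == "json":
        return json.loads(open(path, encoding="utf-8").read())
    if fmt == "yaml":
        return yaml.safe_load(open(path, encoding="utf-8").read())
    return tomllib.load(open(path, "rb"))

def dump(data, fmt: str, pretty: bool = False) -> str:
    if fmt == "json":
        return json.dumps(data, indent=2 if pretty else None, ensure_ascii=False)
    if fmt == "yaml":
        return yaml.safe_dump(data, sort_keys=False, allow_unicode=True)
    return tomli_w.dumps(data)

print(dump(load("config.yaml", "yaml"), "json", pretty=True))
```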
Show full path, file size, line-count, and extension for each file, with optional metadata.
This command collects and displays statistics for files:
- Shows path, number of lines, file size, and extension
- Supports recursive scan of directories
- Optionally, prints file metadata (creation & modification dates, type, encoding)
- Supports extension filtering, result saving to logs
Ideal for code audits, data checks, repo hygiene, or quick file reporting.
shellman file_stats [PATHS...] [OPTIONS]
Arguments:
PATHS... List of files or directories (can be mixed; required)
Options:
--ext EXT
Only include files with this extension (e.g. py)
--meta
Include metadata: created, modified, type, encoding
--output
Save results to logs/file_stats_.log
--lang-help {pl,eng}
Show this help in Polish or English
- Show stats for a single file
shellman file_stats main.py
- Recursively report on all .txt files in docs/
shellman file_stats docs --ext txt
- Show stats for all files in several folders and print metadata
shellman file_stats src test --meta
- Save a full report to a log file
shellman file_stats mydir --output
Accepts multiple files or directories at once.
Directory arguments are searched recursively.
For binary files, line-count may not be meaningful.
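The per-file statistics reduce to a stat() call plus a line count. A small sketch of what gets collected (file_stats here is a hypothetical helper, not the command's code):

```python
# Sketch: collect path, size, line count, extension, and optional mtime metadata.
import datetime
from pathlib import Path

def file_stats(path: Path, meta: bool = False) -> dict:
    st = path.stat()
    with path.open("rb") as fh:
        line_count = sum(1 for _ in fh)       # not meaningful for binary files
    info = {"path": str(path), "size_kb": round(st.st_size / 1024, 2),
            "lines": line_count, "ext": path.suffix}
    if meta:
        info["modified"] = datetime.datetime.fromtimestamp(st.st_mtime).isoformat()
    return info

print(file_stats(Path("main.py"), meta=True))
```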
Find files by name, extension, or content, with flexible filtering and output options.
This command lets you search for files in a directory tree using:
- Partial name matches (substring in filename)
- File extension (e.g. only .py or .txt)
- File content (find all files containing specific text)
- Any combination of the above
You can also show file sizes and export results to a log file.
shellman find_files SEARCH_PATH [OPTIONS]
Arguments & Options
SEARCH_PATH
Directory to search (required; recursion is automatic)
--name FRAGMENT
Match filenames containing this substring
--content TEXT
Search for files containing this text (UTF-8)
--ext EXT
Only include files with this extension (e.g. --ext py)
--output
Save results to logs/find_files_.log
--show-size
Show file size (in KB or MB) next to each result
--lang-help {pl,eng}
Show this help in Polish or English
- Find all files with "log" in the name under ./data
shellman find_files ./data --name log
- Find .md files containing the word "draft"
shellman find_files docs --ext md --content draft
- Find all Python files in a folder and show their size
shellman find_files src --ext py --show-size
- Save all .csv files found in exports to a log file
shellman find_files exports --ext csv --output
SEARCH_PATH must be an existing directory.
Multiple filters can be combined for precise results.
Content search is case-sensitive and only supports UTF-8 text files.
Binary files or those with read errors are skipped.
Recursion is always enabled (scans all subfolders).
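The search itself is a recursive walk that applies the name, extension, and content filters in turn, skipping unreadable or non-UTF-8 files. An illustrative sketch (find_files is a hypothetical helper, not the command's code):

```python
# Sketch of a recursive name/extension/content search (case-sensitive, UTF-8 only).
from pathlib import Path

def find_files(root: str, name: str | None = None, ext: str | None = None,
               content: str | None = None):
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        if name and name not in path.name:
            continue
        if ext and path.suffix != f".{ext.lstrip('.')}":
            continue
        if content:
            try:
                if content not in path.read_text(encoding="utf-8"):
                    continue
            except (UnicodeDecodeError, OSError):
                continue          # skip binary/unreadable files
        yield path

for hit in find_files("docs", ext="md", content="draft"):
    print(hit)
```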
Extract and filter JSON data with optional field selection.
This command lets you:
- Extract data from JSON files, navigating nested objects/lists with a dot-separated path
- Filter list entries by field value (e.g., only objects with status=active)
- Select fields to include in the output (e.g., only show id,name)
- Output results as JSON lines (one object per line), to screen or a file, optionally with paging
Perfect for quickly slicing, dicing, and filtering big JSONs, logs, or API exports.
shellman json_extract FILE [OPTIONS]
Options
FILE
Input JSON file (required)
--path KEY.PATH
Dot-separated path to a list or object, e.g. items, data.results, root.items
--filter KEY=VALUE
Only keep items where KEY == VALUE (string comparison)
--fields FIELD1,FIELD2,...
Output only these fields
--output FILE
Save output to file instead of printing
--interactive
Pipe output to a pager (e.g. less)
--lang-help {pl,eng}
Show this help in Polish or English
- Extract top-level list from data.json
shellman json_extract data.json
- Extract items from a nested path and filter by key
shellman json_extract data.json --path data.items --filter status=active
- Output only certain fields as JSON lines
shellman json_extract data.json --path users --fields id,name
- Save filtered results to a file and preview interactively
shellman json_extract logs.json --path logs --filter level=ERROR --fields time,message --output errors.jsonl --interactive
If the path does not resolve to a list, the result is wrapped in a list for uniform handling.
Filter uses simple string equality.
If you use --fields, only those fields are kept in each result object.
Output is always one JSON object per line (JSONL format).
Works only with valid UTF-8 JSON files.
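The core logic is: walk the dot-separated path, wrap non-lists, filter by string equality, optionally project fields, and emit one JSON object per line. A compact sketch of that flow (extract is a hypothetical helper):

```python
# Sketch of dot-path navigation, equality filtering, field projection, and JSONL output.
import json

def extract(file: str, path: str | None = None, flt: tuple[str, str] | None = None,
            fields: list[str] | None = None) -> None:
    with open(file, encoding="utf-8") as fh:
        data = json.load(fh)
    for key in (path.split(".") if path else []):
        data = data[key]
    items = data if isinstance(data, list) else [data]    # wrap non-lists for uniform handling
    for item in items:
        if flt and str(item.get(flt[0])) != flt[1]:
            continue
        if fields:
            item = {k: item[k] for k in fields if k in item}
        print(json.dumps(item, ensure_ascii=False))        # one JSON object per line (JSONL)

extract("data.json", path="data.items", flt=("status", "active"), fields=["id", "name"])
```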
Extract, count or summarize lines in files and folders, with powerful filters and context options.
This command lets you:
- Extract matching lines from files (with before/after context)
- Count lines matching filters or regular expressions
- Summarize files: total files, total lines, total matches
- Filter by content, regex, or extension
- Show percentages, file sizes, and save to logs
- Combine multiple options for advanced text/file analytics
shellman lines [FILES OR FOLDERS...] [OPTIONS]
Options
--extract
Print matching lines (default if no mode is chosen)
--count
Show only the number of matching lines
--summary
Print a summary: total files, total lines, matches
--contains TEXT
Only include lines containing this text
--not-contains TEXT
Only include lines NOT containing this text
--regex PATTERN
Use a regular expression to match lines
--ignore-case
Case-insensitive matching
--before N
Show N lines before each match (context)
--after N
Show N lines after each match (context)
--ext EXT
Only include files with this extension
--percent
Show percentage of matches vs total lines
--output FILE
Save results to a file
--interactive
Pipe results to pager (e.g. less)
--show-size
Show file size next to results
--lang-help {pl,eng}
Show this help in Polish or English
- Print all lines containing "TODO" from .py files
shellman lines src --ext py --contains TODO
- Count the number and percent of lines matching a regex
shellman lines logs.txt --regex "ERROR.*" --count --percent
- Extract lines (with 2 lines before and 1 after each match) from a folder
shellman lines logs/ --contains WARNING --before 2 --after 1
- Show summary of all .txt files in a directory
shellman lines data --ext txt --summary
- Save matching lines to a log and preview interactively
shellman lines myfile.txt --contains "secret" --output results.log --interactive
By default, --extract is used if no mode is given.
You cannot use --contains and --regex together.
Inputs can be a mix of files and folders (folders searched recursively).
If a filter is not set, all lines are included.
The --before and --after options add context lines to each match.
Non-UTF-8 files may be read with errors ignored.
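Context extraction works by expanding each match into a window of surrounding line numbers and printing each line at most once. A minimal sketch for a single file and a plain --contains filter (extract_lines is a hypothetical helper):

```python
# Sketch of --contains with --before/--after context windows (single file, UTF-8).
from pathlib import Path

def extract_lines(path: str, contains: str, before: int = 0, after: int = 0) -> None:
    lines = Path(path).read_text(encoding="utf-8", errors="ignore").splitlines()
    shown: set[int] = set()
    for i, line in enumerate(lines):
        if contains in line:
            for j in range(max(0, i - before), min(len(lines), i + after + 1)):
                if j not in shown:
                    shown.add(j)
                    print(f"{j + 1}:{lines[j]}")   # "lineno:text", like the example output

extract_lines("logs.txt", "WARNING", before=2, after=1)
```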
Show currently open TCP/UDP ports and serial ports (COM/LPT/tty) with process, PID, state, and device description.
This command allows you to:
- List all open TCP and UDP ports with associated process name and PID
- Filter by protocol (tcp or udp) or by local port number
- Display in a readable table or as raw JSON (for scripting/automation)
- List physical serial/parallel ports (COM, LPT, tty, cu) with human-readable device descriptions (cross-platform)
Supports Linux, macOS, and Windows. Requires psutil and pyserial for full features.
shellman open_ports [OPTIONS]
Options
--proto {tcp, udp}
Filter only TCP or only UDP ports
--port N
Filter by local port number (e.g. --port 8080)
--json
Output raw JSON for open TCP/UDP sockets (no table)
--serial
Show available serial/parallel ports (COM, LPT, tty, cu) with device descriptions
--lang-help {pl,eng}
Show this help in Polish or English
- List all open TCP and UDP ports with process info
shellman open_ports
- Show only open TCP ports
shellman open_ports --proto tcp
- Show all processes listening on port 5432
shellman open_ports --port 5432
- Output open ports as JSON (for scripts, logs, or audits)
shellman open_ports --json
- List all physical serial/parallel ports with device description
shellman open_ports --serial
TCP/UDP info is gathered using the psutil library.
Serial port listing requires pyserial; falls back to device globbing on POSIX.
On first run, psutil may auto-install if missing (internet required).
If running without permissions, you may not see all ports/processes.
Output as JSON includes all fields: protocol, pid, process, local/remote addresses, state.
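Since the TCP/UDP data comes from psutil, the listing is roughly a loop over psutil.net_connections(). A sketch of that loop (columns and formatting are illustrative, not Shellman's exact output):

```python
# Sketch: list open TCP/UDP sockets with process names via psutil.
import socket
import psutil

for conn in psutil.net_connections(kind="inet"):
    proto = "tcp" if conn.type == socket.SOCK_STREAM else "udp"
    laddr = f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else "-"
    try:
        name = psutil.Process(conn.pid).name() if conn.pid else "?"
    except psutil.Error:
        name = "?"                                   # permissions or vanished process
    print(f"{proto:4} {laddr:22} {conn.status:12} pid={conn.pid} {name}")
```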
Replace occurrences of text in multiple files, with preview, diff, and interactive confirmation.
This command allows you to:
- Search and replace specific text in many files at once (recursively in a folder)
- Filter files by extension
- Preview changes as unified diffs before saving
- Confirm each replacement interactively
- Safely batch-edit configs, code, docs, notes, and more
shellman replace_text SEARCH_PATH --find TEXT --replace TEXT [OPTIONS]
Options
SEARCH_PATH
Directory to search (required)
--find TEXT
Required. Text to find
--replace TEXT
Required. Replacement text
--ext EXT
Only process files with this extension
--in-place
Write changes back to files (otherwise preview only)
--preview
Show unified diff preview of proposed changes
--confirm
Ask interactively before replacing in each file
--lang-help {pl,eng}
Show this help in Polish or English
- Replace all "foo" with "bar" in .md files under docs/ (preview only)
shellman replace_text docs --find foo --replace bar --ext md --preview
- Replace "localhost" with "127.0.0.1" in config files, with interactive confirmation
shellman replace_text config --find localhost --replace 127.0.0.1 --ext conf --in-place --confirm
- Bulk update copyright year in project
shellman replace_text . --find "Copyright 2023" --replace "Copyright 2024" --ext py --in-place --preview
Changes are only made if --in-place is given; otherwise, nothing is written.
You can combine --preview and --in-place to see diffs before changes.
If --confirm is used, you will be prompted (Y/n) for each file.
Only files containing the search text are processed.
Recursively searches all subfolders.
Only processes text files (UTF-8, errors ignored).
Extension filtering matches the file suffix (e.g., .md).
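The preview is a standard unified diff of the file before and after replacement, which the difflib module produces directly. An illustrative sketch (preview_replace is a hypothetical helper):

```python
# Sketch: show a unified diff of a proposed replacement before writing anything.
import difflib
from pathlib import Path

def preview_replace(path: Path, find: str, replace: str) -> None:
    old = path.read_text(encoding="utf-8", errors="ignore")
    if find not in old:
        return                                    # only files containing the text
    new = old.replace(find, replace)
    diff = difflib.unified_diff(old.splitlines(keepends=True),
                                new.splitlines(keepends=True),
                                fromfile=str(path), tofile=f"{path} (new)")
    print("".join(diff))
    # path.write_text(new, encoding="utf-8")      # only with --in-place

for p in Path("docs").rglob("*.md"):
    preview_replace(p, "foo", "bar")
```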
Run a quick internet speed test (download, upload, ping), using the fastest available method on your platform.
This command performs an internet speed test and displays:
- Download speed (Mbps)
- Upload speed (Mbps)
- Ping (ms)
It automatically selects the best available backend:
- Ookla CLI (speedtest binary, official)
- speedtest-cli (Python binary)
- speedtest-cli (Python module, auto-installs if missing)
Results are printed as text or raw JSON; great for reports, monitoring, or troubleshooting.
shellman speed_test [OPTIONS]
Options
--json
Output raw JSON with download, upload, and ping
--only {download, upload, ping}
Show only one metric (e.g. --only download)
--lang-help {pl,eng}
Show this help in Polish or English
- Run a full speed test (download/upload/ping)
shellman speed_test
- Show only download speed
shellman speed_test --only download
- Output results as raw JSON
shellman speed_test --json
For best accuracy, the official speedtest binary from Ookla will be used if available.
Falls back to Python speedtest-cli if necessary (may auto-install).
Actual speeds depend on your current connection, time of day, and server location.
The test may take a few seconds to complete.
On first use, an internet connection is needed to install missing backends.
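With the speedtest-cli backend (the Python module mentioned above), a test is only a few calls; download() and upload() return bits per second, so dividing by 1,000,000 gives Mbps. A minimal sketch, assuming the speedtest-cli package is installed:

```python
# Sketch of a basic speed test via the speedtest-cli module.
import speedtest

st = speedtest.Speedtest()
st.get_best_server()                      # pick the lowest-latency server
download = st.download() / 1_000_000      # bits/s -> Mbps
upload = st.upload() / 1_000_000
print(f"Download: {download:.1f} Mbps  Upload: {upload:.1f} Mbps  Ping: {st.results.ping:.0f} ms")
```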
Show system, shell, and tool environment summary: cross-platform hardware & OS info for Linux, macOS, Windows.
This command gives you a complete summary of your system, including:
- OS, version, architecture, host and WSL info
- Serial number (where supported)
- CPU: model, core count, frequency
- Sensors/temperature: (when supported)
- GPU: detected graphics cards
- Battery: charge status, percent, time remaining
- Bluetooth: detected devices/adapters
- Tools: python3, jq, xlsx2csv, etc.
- Memory: total/used/available
- Uptime & Load: since last boot, load averages
- Disks: usage, free space per partition
- Network: local/public IP, interfaces
- Packages: package manager type, package count
- Printers: detected printers
- Wi-Fi: connected SSID (if available)
- Displays: screen resolutions, number of displays
Works on Linux, macOS, and Windows, with automatic adaptation to platform capabilities.
shellman sys_summary [--lang-help pl|eng]
🔎 System Summary
────────────────────────────────────────────
🖥️ OS & Host
────────────────────────────────────────────
System : Linux
Distro : Ubuntu 22.04
Version : 5.15.0-89-generic
Architecture : x86_64
WSL : No
Hostname : my-laptop
🔑 Serial Number
────────────────────────────────────────────
Serial : ABCD1234EFGH
🦾 CPU
────────────────────────────────────────────
Cores : 8 (logical: 16)
Frequency : 3200.0 MHz
Model : Intel(R) Core(TM) i7-1165G7 CPU @ 2.80GHz
🌡️ Sensors / Temperature
────────────────────────────────────────────
coretemp: temp = 48.0 °C
🖥️ GPU
────────────────────────────────────────────
GPU : Intel Iris Xe Graphics
🔋 Battery
────────────────────────────────────────────
Percent : 85%
Plugged in : Yes
Time left : 2:34:00
📡 Bluetooth
────────────────────────────────────────────
Powered: yes
🛠️ Tools
────────────────────────────────────────────
python3 : /usr/bin/python3
jq : /usr/bin/jq
xlsx2csv : /usr/local/bin/xlsx2csv
🧠 Memory
────────────────────────────────────────────
total used free shared buff/cache available
Mem: 15Gi 3.7Gi 8.1Gi 400Mi 3.3Gi 11Gi
Swap: 2.0Gi 0.0Ki 2.0Gi
⏱️ Uptime & Load
────────────────────────────────────────────
Uptime : up 4 days, 5 hours, 12 minutes
Load Avg. : 0.65, 0.45, 0.35
💾 Disks
────────────────────────────────────────────
Filesystem Size Used Avail Use% Mounted on
/dev/nvme0n1p2 470G 60G 387G 14% /
🌐 Network
────────────────────────────────────────────
Local IP : 192.168.0.13
Public IP : 83.12.45.67
📦 Packages
────────────────────────────────────────────
Pkg Manager : apt
Total pkgs : 2024
🖨️ Printers
────────────────────────────────────────────
Printer : HP_LaserJet
📶 Wi-Fi
────────────────────────────────────────────
Connected SSID: Domowy_5G
🖥️ Displays
────────────────────────────────────────────
eDP-1 connected primary 1920x1080+0+0
HDMI-1 disconnected
Some fields require extra permissions (e.g., serial number, sensors, dmidecode).
Some info may be unavailable or limited depending on platform/hardware.
Battery, sensors, Bluetooth, printers, Wi-Fi may not be present on servers or VMs.
For full hardware info on Windows, run from PowerShell/CMD (not Git Bash).
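Several of the fields above can be gathered with the standard platform module plus psutil. A tiny sketch covering the OS, CPU, and memory rows (output labels are illustrative, not Shellman's exact formatting):

```python
# Sketch: gather a few sys_summary-style fields with platform and psutil.
import platform
import psutil

print("System      :", platform.system(), platform.release())
print("Architecture:", platform.machine())
print("Hostname    :", platform.node())
print("CPU cores   :", psutil.cpu_count(logical=False), f"(logical: {psutil.cpu_count()})")
mem = psutil.virtual_memory()
print("Memory      :", f"{mem.total / 2**30:.1f} GiB total, {mem.available / 2**30:.1f} GiB available")
```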
Create zip archives from folders or files in batch, with extension filtering and optional password.
Quickly compress and archive many files or folders at once:
- Archive all files (optionally filtered by extension) in a source directory
- Per-folder mode: Create one ZIP per immediate subfolder
- Custom name prefix for archives
- Password protection (warning: ZIP's -P is weak!)
- Output goes to the selected directory (default: ./zips)
Requires the zip command-line tool on your system.
shellman zip_batch [OPTIONS]
Options
--path DIR
Source directory to scan (default: current directory)
--ext EXT
Only include files with this extension (e.g. --ext txt)
--per-folder
Create one archive per immediate sub-folder (instead of one big archive)
--output DIR
Output directory for ZIPs (default: ./zips)
--name PREFIX
Prefix for archive filenames (default: batch_)
--password PASS
Set ZIP password (uses zip -P; weak encryption)
--lang-help {pl,eng}
Show this help in Polish or English
- Archive all files in reports/ as one ZIP
shellman zip_batch --path reports
- Create separate ZIPs for each folder in datasets/ (per-folder mode)
shellman zip_batch --path datasets --per-folder --name data_
- Archive only .csv files and save to myzips/
shellman zip_batch --ext csv --output myzips
- Create a password-protected ZIP of all .pdf files
shellman zip_batch --ext pdf --password MySecret123
Requires the zip utility (Linux/macOS/WSL/Windows with zip.exe).
Password protection (-P) in zip is weak; do not rely on it for strong security!
--per-folder makes one archive per direct subdirectory (ignores files at root level).
Filenames have the given prefix (default: batch_) and a timestamp for batch mode.
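Because the command shells out to the zip utility, per-folder mode is essentially a loop that runs zip -r once per immediate subfolder. A sketch with subprocess (paths, prefix, and timestamp format are placeholders):

```python
# Sketch of per-folder batch zipping by shelling out to the zip tool.
import subprocess
from datetime import datetime
from pathlib import Path

src, out, prefix = Path("./datasets"), Path("./zips"), "data_"
out.mkdir(exist_ok=True)
stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
for folder in (p for p in src.iterdir() if p.is_dir()):
    archive = out / f"{prefix}{folder.name}_{stamp}.zip"
    # -r recurses into the folder; add ["-P", password] for (weak) ZIP password protection
    subprocess.run(["zip", "-r", str(archive), str(folder)], check=True)
```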
Pull-requests and feature ideas are welcome! Each command lives in shellman/commands/ - feel free to add your own tool.