10th Week of 2022
Coding⚑
Generic Coding Practices⚑
Use warnings to evolve your code⚑
- New: Using warnings to evolve your package.

  Regardless of the versioning system you're using, once you reach your first stable version, your commitment to your end users must be to give them time to adapt to changes in your program. So whenever you want to introduce a breaking change, release it under a new interface and, in parallel, start emitting `DeprecationWarning` or `UserWarning` messages whenever someone invokes the old one. Maintain this state for a defined period (for example, six months), and communicate explicitly in the warning message the timeline by which users have to migrate. This gives everyone time to move to the new interface without breaking their systems, and afterwards the library can remove the old design for good. As an added benefit, only people using the old interface will ever see the warning, as opposed to affecting everyone (as happens with a semantic versioning major version bump).
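A minimal sketch of the pattern (the function names and deadline are made up for illustration):

```python
import warnings


def fetch_user(user_id):
    """New interface that replaces get_user_data()."""
    return {"id": user_id}


def get_user_data(user_id):
    """Deprecated: use fetch_user() instead."""
    # Emit the warning only from the old entry point, stating the migration deadline
    warnings.warn(
        "get_user_data() is deprecated and will be removed on 2022-09-01; "
        "use fetch_user() instead.",
        DeprecationWarning,
        stacklevel=2,
    )
    return fetch_user(user_id)
```

Callers of the old name see the warning with the migration deadline, while users of the new interface never do.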
Python⚑
- New: Add humanize library.

  `humanize`: This modest package contains various common humanization utilities, like turning a number into a fuzzy human-readable duration ("3 minutes ago") or into a human-readable size or throughput.
Boto3⚑
Code Styling⚑
- New: Solve W1514 pylint error.

  Pylint raises `W1514` (`unspecified-encoding`) when `open()` is called without an explicit `encoding` argument. The fix is to pass one:

  ```python
  with open('file.txt', 'r', encoding='utf-8') as file:
      contents = file.read()
  ```
FastAPI⚑
- Probably an exception was raised in the backend; use `pdb` to follow the trace and catch where it happened.
Python Snippets⚑
- New: How to Find Duplicates in a List in Python.

  ```python
  numbers = [1, 2, 3, 2, 5, 3, 3, 5, 6, 3, 4, 5, 7]
  duplicates = [number for number in numbers if numbers.count(number) > 1]
  unique_duplicates = list(set(duplicates))
  ```

  If you want to count the number of occurrences of each duplicate, you can use:

  ```python
  from collections import Counter

  numbers = [1, 2, 3, 2, 5, 3, 3, 5, 6, 3, 4, 5, 7]
  counts = dict(Counter(numbers))
  duplicates = {key: value for key, value in counts.items() if value > 1}
  ```

  To remove the duplicates, use a combination of `list` and `set`:

  ```python
  unique = list(set(numbers))
  ```
- New: How to decompress a gz file.

  ```python
  import gzip
  import shutil

  with gzip.open('file.txt.gz', 'rb') as f_in:
      with open('file.txt', 'wb') as f_out:
          shutil.copyfileobj(f_in, f_out)
  ```
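The reverse operation, compressing a file into a `.gz`, is symmetric. A sketch (the sample file is created first so the example is self-contained):

```python
import gzip
import shutil

# Create a sample file to compress
with open('file.txt', 'w', encoding='utf-8') as f:
    f.write('hello world\n')

# Compress file.txt into file.txt.gz
with open('file.txt', 'rb') as f_in:
    with gzip.open('file.txt.gz', 'wb') as f_out:
        shutil.copyfileobj(f_in, f_out)
```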
- New: How to compress/decompress a tar file.

  ```python
  import tarfile


  def compress(tar_file, members):
      """Add files (`members`) to a tar_file and compress it."""
      tar = tarfile.open(tar_file, mode="w:gz")
      for member in members:
          tar.add(member)
      tar.close()


  def decompress(tar_file, path, members=None):
      """Extract `tar_file` and put the `members` in `path`.

      If members is None, all members of `tar_file` will be extracted.
      """
      tar = tarfile.open(tar_file, mode="r:gz")
      if members is None:
          members = tar.getmembers()
      for member in members:
          tar.extract(member, path=path)
      tar.close()
  ```
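As a usage sketch (file names are made up), here is the same round trip written with context managers, which close the archive automatically instead of requiring an explicit `tar.close()`:

```python
import tarfile

# Create two sample files to archive
with open('a.txt', 'w', encoding='utf-8') as f:
    f.write('first file\n')
with open('b.txt', 'w', encoding='utf-8') as f:
    f.write('second file\n')

# Equivalent of compress('files.tar.gz', ['a.txt', 'b.txt'])
with tarfile.open('files.tar.gz', mode='w:gz') as tar:
    for member in ['a.txt', 'b.txt']:
        tar.add(member)

# Equivalent of decompress('files.tar.gz', 'extracted')
with tarfile.open('files.tar.gz', mode='r:gz') as tar:
    for member in tar.getmembers():
        tar.extract(member, path='extracted')
```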
DevOps⚑
Infrastructure Solutions⚑
Krew⚑
- New: Introduce krew.

  Krew is a tool that makes it easy to use kubectl plugins. Krew helps you discover plugins, and install and manage them on your machine. It is similar to tools like `apt`, `dnf` or `brew`.
Ksniff⚑
- New: Introduce Ksniff.

  Ksniff is a kubectl plugin that eases sniffing traffic on Kubernetes pods using `tcpdump` and Wireshark.
Mizu⚑
- New: Introduce mizu.

  Mizu is an API traffic viewer for Kubernetes; think `tcpdump` and Chrome DevTools combined.
Debugging⚑
- New: How to debug kubernetes network traffic.

  Sometimes you need to monitor the network traffic that goes between pods to solve an issue. There are different ways to see it:

  - Using Mizu
  - Running tcpdump against a running container
  - Using ksniff
  - Using ephemeral debug containers

  Of all the solutions, the cleanest and easiest to use is Mizu.
Operative Systems⚑
Linux⚑
Linux Snippets⚑
- Correction: Clean snap data.

  If you're using `snap` you can clean space by:

  - Reducing the number of versions kept of a package with `snap set system refresh.retain=2`.
  - Removing the old versions with `clean_snap.sh`:

    ```bash
    #!/bin/bash
    # Removes old revisions of snaps
    # CLOSE ALL SNAPS BEFORE RUNNING THIS
    set -eu

    LANG=en_US.UTF-8 snap list --all | awk '/disabled/{print $1, $3}' |
        while read snapname revision; do
            snap remove "$snapname" --revision="$revision"
        done
    ```
- Correction: Clean journalctl data.

  - Check how much space it's using: `journalctl --disk-usage`
  - Rotate the logs: `journalctl --rotate`

  Then you have three ways to reduce the data:

  - Clear journal logs older than X days: `journalctl --vacuum-time=2d`
  - Restrict logs to a certain size: `journalctl --vacuum-size=100M`
  - Restrict the number of log files: `journalctl --vacuum-files=5`

  The operations above will affect the logs you have right now, but they won't solve the problem in the future. To let `journalctl` know how much space you want to use, open the `/etc/systemd/journald.conf` file and set `SystemMaxUse` to the amount you want (for example, `1000M` for a gigabyte). Once edited, restart the service with `sudo systemctl restart systemd-journald`.
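The `SystemMaxUse` cap lives in the `[Journal]` section of `/etc/systemd/journald.conf`:

```ini
[Journal]
SystemMaxUse=1000M
```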
- New: Set up docker logs rotation.

  By default, the stdout and stderr of a container are written to a JSON file located at `/var/lib/docker/containers/[container-id]/[container-id]-json.log`. If you leave it unattended, it can take up a large amount of disk space.

  If this JSON log file takes up a significant amount of the disk, you can purge it with:

  ```bash
  truncate -s 0 <logfile>
  ```

  You could set up a cronjob to purge these JSON log files regularly, but in the long term it's better to set up log rotation. You can do this by adding the following values to `/etc/docker/daemon.json`:

  ```json
  {
    "log-driver": "json-file",
    "log-opts": {
      "max-size": "10m",
      "max-file": "10"
    }
  }
  ```