I suppose we do not even know the exact reasons for the decline of wildlife populations. It is quite definitely due to massive human intervention, but which aspects exactly? Light pollution might play a big role in insect population decline.
This is a push for privacy, and it fundamentally pushes in the opposite direction from, say, "forming accurate knowledge about the world".
How can we combat being misled by fake AI-generated images? I'd say keeping track of provenance is what we should adopt, at least as an option. I hope we will find ways to propagate images over the net while reliably preserving how, when, and where they were taken.
This is far from the first time I have seen indignation on HN about the LaLiga blockings. Sadly, all this rage does not seem to lead to any change.
I'd like to suggest some steps that might/should be followed, which I will not pursue personally; in my defense, I do not live in Spain and am not affected.
1) (First! Low-effort.) Somebody should create a space on the internet where such anecdotes can be shared and where people with the common goal of fixing internet access in Spain can meet: a Telegram group, a Discord channel, a subreddit...
2) Probably create a wiki with related research: the legal framework, possible actions, etc.
3) Raise public awareness. Create a resource/website with a schedule of past and future "semi-blackouts", a simple explanation of the effects a layman may notice, etc.
4) Explore legal actions that might be taken. How could politicians be forced to discuss this issue? For instance, I know that Portugal has an official mechanism for putting forward petitions, which get discussed in parliament if they gather enough votes [1].
The space of possible demands in such petitions is vast. For instance:
- Make LaLiga partly compensate the price of internet access
- Force LaLiga to include an educational notice at the beginning and end of each broadcast, with a title like "Start of reduced internet connectivity" / "End of reduced internet connectivity"
Humankind is not doing well at implementing new policies. For each new policy (like, in this case, blocking access to parts of the internet during soccer games), we should really strive to:
- Run the policy in a small-scale scenario first (e.g., test the blocking in a small part of Spain before a country-wide rollout)
- Implement channels to gather feedback from those affected by the policy's implementation (in this case, the OP got a web page describing why the page was blocked; a bit of sanity! It would be even better if it were served with HTTP status code 451)
- Policy instructions
- When deciding on a policy, set a date at which it should be reconsidered and revised, using data collected while it was in effect
- ...and more I have not thought of.
Let's strive to cultivate these principles in all areas of life where we can affect how new policies are implemented.
I think that in most cases jq is launched to extract a value from a relatively small JSON document, where raw parsing speed does not matter much. jq is just really slow to start, and version 1.6 was especially abysmal: about 10x slower to start than 1.5.
So any replacement candidate should also be benchmarked for startup time, e.g. with hyperfine "echo '{\"a\": 10}' | jq .a" .
Also, please just use jshon if you only need to extract a specific value from some small JSON. jshon uses far fewer resources by any conceivable metric.
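For illustration, a couple of invocations on a made-up document (jshon's -e descends into a key, -u prints a string without the quotes):

```shell
# jshon syntax: -e KEY extracts a key, -u unquotes string output.
# The sample JSON below is made up for illustration.
echo '{"a": 10, "name": "x"}' | jshon -e a        # → 10
echo '{"a": 10, "name": "x"}' | jshon -e name -u  # → x
```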
It seems this was possible because ripgrep uses CPU inefficiently when run multithreaded, consuming about 2x more CPU time than GNU grep.
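If you want to check this on your own data, one rough way is to compare the user+sys CPU figures rather than wall time ('pattern' and ./corpus are placeholders for your own search and directory):

```shell
# Compare the user+sys lines that 'time' prints, not the wall-clock ("real") line.
# 'pattern' and ./corpus are placeholders; rg and GNU grep assumed installed.
time rg 'pattern' ./corpus >/dev/null
time grep -r 'pattern' ./corpus >/dev/null
```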
I use a simpler solution (as measured by number of taps on the screen): share a place from Google Maps to https://f-droid.org/packages/page.ooooo.geoshare , which can convert it to an actual latitude/longitude, which in turn can be shared to any app that works with locations: OsmAnd, Organic Maps, Uber, ...
One part of me likes this solution for being fast and elegant, and I've bookmarked it so I can recommend it to friends. But another part of me is frustrated that so many everyday computer users have little to no awareness of basic features like cut/copy/paste on mobile, so that yet another app install becomes the solution.
Not trying to imply this about you in particular, just griping that the general lack of awareness about how to take advantage of what should be fundamental/foundational OS features means that whole apps get written to, in essence, duplicate those features.
Often the desktop client just cannot connect to the mobile one.
At first I noticed that this happens when the desktop client starts to emit lines like this in its logs (reported [1]):
kdeconnect.core: Too many remembered identities, ignoring "<id of KDE Connect on my android device>" received via UDP
Restarting the desktop client helped, so I wrote a watcher that monitors the logs for such lines and restarts kdeconnectd. But that turned out to be insufficient. Now I have the following script running in the background to restart kdeconnectd whenever the connection is missing, and I can finally use KDE Connect reliably:
#!/bin/dash -x
while sleep 1m; do
  # Only act while a Wi-Fi connection is active.
  nmcli connection show --active | grep -q wifi || continue
  # If any paired device is already reachable, nothing to do.
  kdeconnect-cli -l | grep -q reachable && continue
  # notify-send 'No reachable devices via kdeconnect. Restarting'
  # Log the unit's current state for later debugging, then restart it.
  systemctl --no-pager --user status app-org.kde.kdeconnect.daemon@autostart.service
  systemctl --no-pager --user stop app-org.kde.kdeconnect.daemon@autostart.service
  # Make sure no stray daemon process survives the unit stop.
  killall kdeconnectd
  systemctl --no-pager --user start app-org.kde.kdeconnect.daemon@autostart.service
done
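If you'd rather not keep a hand-started loop around, the script could itself be wrapped in a systemd user service; a sketch, where the unit name and script path are my assumptions:

```
# ~/.config/systemd/user/kdeconnect-watchdog.service (assumed path and name)
[Unit]
Description=Restart kdeconnectd when no paired device is reachable

[Service]
# %h expands to the home directory; adjust to wherever the script lives.
ExecStart=%h/bin/kdeconnect-watchdog.sh
Restart=on-failure

[Install]
WantedBy=default.target
```

Then enable it with: systemctl --user enable --now kdeconnect-watchdog.service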