We are using Apple's smartwatch to measure some health stats during our workouts. And the Apple Watch is doing a great job at that.
With all that polish one would expect better from what Apple has to offer in the software department…
The Apple Watch has monthly challenges that are automatically generated from previous measurements. But seeing that an already well-above-average activity number would have to be doubled to complete the challenge is absurd – to a degree where the challenges are arguably a health risk…
I am a long-time subscriber to a service that delivers a curated selection of scientific papers to your inbox every morning.
And even better: on top of the pick and a link to the paper you also get a great summary with additional links and hints on the topic.
The Morning Paper: a short summary every weekday of an important, influential, topical or otherwise interesting paper in the field of computer science.
Depending on your specific interests the chosen papers will give you deep insights into certain topics. Recently a lot of AI-related topics have been showing up there.
The papers are delivered by email, by RSS feed or by just reading the blog.
Discordianism is a paradigm based upon the book Principia Discordia, written by Greg Hill with Kerry Wendell Thornley in 1963, the two working under the pseudonyms Malaclypse the Younger and Omar Khayyam Ravenhurst.
According to its primary historian, Adam Gorightly, Discordianism was founded as a parody religion. Many outside observers still regard Discordianism as a parody religion, although some of its adherents may utilize it as a legitimate religion or as a metaphor for a governing philosophy.
The Principia Discordia, if read literally, encourages the worship of Eris, known in Latin as Discordia, the goddess of disorder, or archetypes and ideals associated with her. Depending on the version of Discordianism, Eris might be considered the goddess exclusively of disorder or the goddess of disorder and chaos.
Both views are supported by the Principia Discordia. The Principia Discordia holds three core principles: the Aneristic (order), the Eristic (disorder), and the notion that both are mere illusions.
Due to these principles, a Discordian believes there is no distinction between disorder and chaos, since the only difference between the two is that one refers to ‘order’.
This is likely a major reason for the inconsistency in the wording. An argument presented by the text is that it is only by rejecting these principles that you can truly perceive reality as it is, chaos.
And given that information you can expect a Discordian calendar to exist. This calendar defines years (YOLD = Year of Our Lady of Discord), seasons and days. And holydays:
Chaosflux is a Holyday of the season of Chaos. It is celebrated on Chaos 50 (Discordian calendar) or February 19 (Gregorian calendar).
Very little is known about this holyday. What we do know is pretty much made up as we go along.
We use the term “smart home” lightly these days. It has become a term of marketing and fantastic stories.
Considering how readily available all sorts of sensors, actuators and personal assistants are these days, one would think that most people would start to expect more from the marketed “smart home”.
I believe that the smartness is to be found in the small and simple things. There are a lot of small things that make something feel smart without it actually being smart about anything.
Being truly smart is something no home has achieved yet – not even by a far stretch of the word. So let’s put that aside for now and move a simple thing into the middle of this article.
Have you ever had an argument about who should – or should have – cleared out the dishwasher after it finished?
We had.
So we outsourced the discussion and decision to a third party. We made our house understand when the dishwasher starts and ends its task. And made it flip a coin.
Power consumption monitoring was already in place for the dishwasher. Adding a hysteresis on top of that monitoring yields a simple “started running” / “stopped running” state for the dishwasher.
Pictured above is said power consumption.
When the values enter the red area in the graph the dishwasher is considered to be running.
When it leaves that area the dishwasher is considered finished/not running.
Now add a bit of random coin-tossing by the computer: each time the dishwasher is detected to have started working, a message is sent out depending on the result of the coin toss.
That message is published and automatically displayed on all active displays in the house (TVs/…) and sent as push notifications to all members that need to be informed of this conclusive and important decision.
In short:
Every time the dishwasher starts, everyone gets a push notification saying who is going to clear it out, based upon a coin toss by a computer.
The base of all of this is a Node-RED flow that uses the power consumption MQTT messages as input, outputs back to MQTT and pushes out the notifications to phones, screens and watches.
Additionally it creates a calendar entry with the start-finish time of the dishwasher run as well as the total energy consumption for this run.
The flow works like this: on the right the message enters the flow from MQTT. The message itself contains just the value of the power consumed at this very moment – in this case by the dishwasher.
The power consumption is updated regularly this way, every couple of seconds. So every couple of seconds this flow runs and gets an updated value of the current power consumption.
Next a hysteresis is applied. In simple terms this means: when the value goes above a certain threshold the dishwasher is considered to be running. When it goes below a certain threshold then it is considered finished.
When the dishwasher changes its state to “running” the flow generates a random number between 0 and 1. This gives a 50:50 chance for either Steffi or Daniel to be the chosen one to clear out the dishwasher for this run. The result is sent out as a push notification to all phones, watches and TVs.
When the dishwasher finishes its run the total energy consumption is taken and sent out as the “I am done” message. This information is also added to the calendar. Voilà.
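The heavy lifting happens in Node-RED, but the core of the flow – hysteresis plus coin toss – fits in a few lines. A minimal Python sketch of the same idea, assuming paho-mqtt; topic names, broker hostname and thresholds are made up and depend on your power meter:

```python
# Sketch of the hysteresis + coin-toss logic; not the actual Node-RED flow.
import random
import paho.mqtt.client as mqtt

POWER_TOPIC = "home/dishwasher/power"     # watts, published every few seconds
NOTIFY_TOPIC = "home/dishwasher/notify"
ON_THRESHOLD = 10.0                       # above this: "running"
OFF_THRESHOLD = 3.0                       # below this: "finished"

running = False

def on_message(client, userdata, msg):
    global running
    watts = float(msg.payload)
    if not running and watts > ON_THRESHOLD:
        running = True
        person = random.choice(["Steffi", "Daniel"])
        client.publish(NOTIFY_TOPIC, f"{person} clears out the dishwasher this time.")
    elif running and watts < OFF_THRESHOLD:
        running = False
        client.publish(NOTIFY_TOPIC, "Dishwasher is done.")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.local")
client.subscribe(POWER_TOPIC)
client.loop_forever()
```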
A calendar? Why a calendar, you may ask. Well, there are several reasons. Think of calendars as another way to interact with the house. All sorts of things happen on a timeline, and a calendar is just a visual aid to interact with timelines.
Be it a home appliance running or motion being sensed for your home alarm system – all of that can be displayed in a calendar and thus automatically synced to all your devices capable of displaying this calendar.
And if you start adding entries to a calendar that the house uses to know what to do next… how about putting light on-off times into an actual calendar right on your phone instead of the complicated browser user interfaces many of those marketed smart homes want us to use?
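To illustrate that idea (this is not the set-up running in our house): a small Python sketch that reads an ICS calendar and switches a light over MQTT whenever a matching event is currently active. The URL, topic and broker name are placeholders, and it assumes timed, timezone-aware events.

```python
# Sketch only: calendar entries drive a light via MQTT.
from datetime import datetime, timezone

import requests
from icalendar import Calendar
import paho.mqtt.publish as publish

ICS_URL = "https://example.com/house-lights.ics"   # placeholder calendar URL
LIGHT_TOPIC = "home/livingroom/light/set"          # placeholder MQTT topic

cal = Calendar.from_ical(requests.get(ICS_URL).content)
now = datetime.now(timezone.utc)

desired = "off"
for event in cal.walk("VEVENT"):
    summary = str(event.get("summary", "")).lower()
    # assumes timed, timezone-aware events; all-day entries are not handled
    if "light" in summary and event.decoded("dtstart") <= now <= event.decoded("dtend"):
        desired = "on"

publish.single(LIGHT_TOPIC, desired, hostname="broker.local")
```

Run something like this every minute and the calendar on your phone becomes the light schedule.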
As of early 2019 I’ve started bringing my content output stream back to this website/weblog.
So far I am feeling quite confident publishing content here, and even with changing legislation I am doing my best to provide as good an experience as possible to each visitor.
As of the end of February 2018 this blog is served securely encrypted, with SSL certificates from Let’s Encrypt.
So security is one thing. Data privacy and safety another.
Apart from the commenting and searching there’s no functionality provided to enter/store data.
comments
When you enter a comment the assumption is that this is your declaration of consent. Your comment will be stored, with the information you’ve entered and can see on-screen as well as the IP address you’ve used. Akismet is then used to provide anti-comment-spam functionality – so part of this data is transferred to Akismet for processing. After moderation the comment is visible to everyone under the article you’ve written it for.
cookies and browser local storage
No cookies are used or required by the website.
server logfiles
There are no logfiles – no access and no error logs. There is no tracking or analysis. There is no advertising or monitoring. All I can see is an nginx and a PHP process delivering websites. Your IP address is known to the server only for as long as it takes to do its job of delivering the asset you asked for. Nothing gets stored on the server side for your read requests.
content loading
No content is loaded from other domains or websites. Everything is hosted on my server. No data is exchanged with externals to bring you this website.
Apple has started to force developers who want to develop and publish on the macOS and iOS platforms to enable two-factor authentication.
Two-factor authentication (also known as 2FA) is a type, or subset, of multi-factor authentication. It is a method of confirming users’ claimed identities by using a combination of two different factors: 1) something they know, 2) something they have, or 3) something they are.
When I just got around to enabling it for one of the Apple accounts I’ve got, there seemed to be a much, much higher security barrier in place already…
That’s probably some sort of zero-factor no-authentication, I guess. Anyway: kudos to Apple for finally forcing people towards minimum standards. Properly integrating the second factor makes this so much simpler for users – Apple’s ecosystem solution already is quite well integrated.
Have you switched all your daily used services to two-factor authentication yet?
Apparently yesterday somebody pushed the wrong button. Twice.
Like most countries, Germany has a system in place to broadcast warnings to the public in case of disasters and the like.
And it has proved quite useful in the past when it comes to the occasional storm or heavy snowfall/rain/lightning.
Seeing them run a test and then send out an apology for having run a test is puzzling and funny at the same time. Everyone has a “bad hair” day, right?
Ever since we changed our daily diet we have been weighing everything we eat or cook. Like, everything.
We quickly found that those kitchen scales you can buy cheaply either do not offer the convenience we are looking for or regularly run out of power and need battery replacements.
As we already have all sorts of home automation in place anyway, the idea was born to integrate an ESP8266 into two of those cheap scales and – while ripping out most of their electronics – base the new scale functionality on the load cells already present in the cheap scales.
So one afternoon in January 2018 I sat down and put all the parts together:
After the hardware portion I sat down and programmed the firmware of the ESP8266. The simple idea: it should connect to the WiFi and to the house MQTT broker.
It would then send its measurements into a /raw topic as well as receive commands (tare, calibration) over a /cmd topic.
The next step was to get the display of the measured weights sorted. The idea for this: write a web application that connects to the MQTT broker’s websocket and receives the stream of measurements. It then adds some additional logic, like a “tare” button in the web interface as well as a list of recent measurements that can be stored for later use.
An additional automation: if the tare button is pressed and the weight is bigger than 10 g, the weight is automatically added to the measurements list in the web app – no matter which of the tare buttons was used, the one in the web app or the physical button on the actual scale. Very practical!
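The web app itself does this in JavaScript over the broker’s websocket, but the auto-add rule is easy to express. A minimal sketch of the same logic in Python with paho-mqtt – the /raw and /cmd suffixes follow the scheme above, while the topic prefix, broker hostname and payload format are assumptions:

```python
# Sketch of the "store weight on tare if above 10 g" rule.
import paho.mqtt.client as mqtt

RAW_TOPIC = "kitchen/scale1/raw"   # assumed topic prefix
CMD_TOPIC = "kitchen/scale1/cmd"

last_weight = 0.0
measurements = []

def on_message(client, userdata, msg):
    global last_weight
    if msg.topic == RAW_TOPIC:
        last_weight = float(msg.payload)          # current weight in grams
    elif msg.topic == CMD_TOPIC and msg.payload == b"tare":
        if last_weight > 10.0:                    # the 10 g rule from above
            measurements.append(last_weight)
            print(f"stored {last_weight:.1f} g -> {measurements}")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.local")                    # assumed broker hostname
client.subscribe([(RAW_TOPIC, 0), (CMD_TOPIC, 0)])
client.loop_forever()
```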
Here’s a short demo of the logic, the scale and the web app in a video:
We are looking at our screens for more and more of the day, and most of that time we are reading or writing text. Text needs to look pretty for our eyes not to get sore – apart from the obvious “being able to tell what letter that is” there is a big portion of personal taste and preference when it comes to the choice of font.
If you have ever traveled on a train or plane with good active noise cancellation headphones, you will probably agree how much more pleasant the trip was with much less noise reaching your ears.
When I tried active noise cancellation for the first time I had that weird sensation as if the pressure around me had suddenly changed. Like being in a very fast elevator or going for a quick dive. It felt weird, but luckily it went away and awe and joy replaced it. Quietness. Bliss.
Now there seem to be people for whom that feeling won’t go away. They get headaches and cannot stand the feeling when using active noise cancellation.
I never had any explanation for this phenomenon – until now. I ran across an article on SoundStage describing that the feeling is in fact not caused by actual changes of pressure but…
According to the engineer, eardrum suck, while it feels like a quick change in pressure, is psychosomatic. “There’s no actual pressure change. It’s caused by a disruption in the balance of sound you’re used to hearing,” he explained.
Aha! The brain gets confused by signals reaching your ears that would not naturally exist. Those signals make no sense, so the brain tries to make sense of them. And voilà – something is sucking on your eardrum!
In 2017 Texas Instruments released a line of cheap industry-grade LED projectors meant to be used in production lines and the like:
DLP® LightCrafter Display 2000 is an easy-to-use, plug-and-play evaluation platform for a wide array of ultra-mobile and ultra-portable display applications in consumer, wearables, industrial, medical, and Internet of Things (IoT) markets. The evaluation module (EVM) features the DLP2000 chipset comprised of the DLP2000.2 nHD DMD, DLPC2607 display controller and DLPA1000 PMIC/LED driver. This EVM comes equipped with a production ready optical engine and processor interface supporting 8/16/24-bit RGB parallel video interface in a small-form factor.
After I had learned about the existence of those small projectors I had to get a couple and try for myself. There would be so many immediate and potential applications in our house.
After having them delivered I did the first trial with just a breadboard and the Raspberry Pi 3.
The projector module has a native resolution of 640×360 – so not exactly high pixel density. And of course, if the image is projected bigger, the screen-door effect is quite noticeable. Also it’s not the brightest of images, depending on the projection size. For the usual use cases the brightness is definitely sufficient.
Downsides
too low brightness for large projection size – no daylight projection
low resolution is an issue for text and web content – it is not so much of an issue for moving pictures as you might think. Video playback is well usable.
flimsy optics on which you need to set the focus manually – it works, but there is no automatic focus or the like.
Upsides
very low powered – 2.5A/5V USB power supply is sufficient for Pi Zero + Projector on full brightness (30 lumen)
low brightness is not always bad – one of our specific use cases requires an image as dim as possible with fine-grained control of the brightness, which this projector offers.
extremely small footprint/size – allows integrating this device into places you would not have thought of.
almost fully silent operation – the only moving part that makes a sound is the color wheel inside the DLP module. You have to put your ear right onto it to hear anything.
passive cooling sufficient – even at full brightness an added heat sink is enough to dissipate the heat generated by the LED.
So what are these use cases that require such a projector you ask?
Night status display:
For the last 20+ years I have been used to sleeping with a “night playlist” running. So far an LED TV was used at the lowest brightness possible – still, it was pretty bright. The projector module allows dimming the brightness down to almost “moon brightness” and also allows adjusting the color balance towards the reds. This means the perfect night projection is possible! And the power consumption is extremely low. A well-watchable, lowest-brightness, red-shifted image also means much lower temperatures on the projector module – it’s crazy how low-powered and low-temperature it runs.
Season Window Projection:
Because the projector is small, low-powered and bright enough for back-lit projection, we tried and succeeded with a Halloween window projection scene last season.
It really looks funky from the outside – funky enough to have several people stop in front of the house and point fingers. All that while power consumption stayed really low.
House overall status projections:
When projecting information is that cheap and power-efficient, it really shines for displaying overall status information like house-alarm status, general switch maps, locations of family members and so on. I’ll leave those to your imagination, as this kind of status display gives away a lot of personal information that isn’t well suited for the internet.
Many cars these days come with head-up displays. This kind of display makes information like the current speed appear to “float” over the street ahead, right in your field of vision.
This has the clear advantage that the driver can stay focused on the street rather than looking away from the street and to the speedometer.
As practical as they seem, these displays are not easy to build and seemingly not easy to design. Every time I came across one, its built-in functionality was limited in a way that makes me assume not a lot of thought had gone into what exactly the driver would like to see and how it should be displayed. There was always so much left to be desired.
Apparently the technology behind these HUDs is at a point where it’s quite affordable to start playing with some ideas to retrofit a car with a more personal and likeable version.
So I started to take a look at what is available – smartphones have bright displays, and I had never tried to see what happens when you use one to project information onto the windshield. So I tried.
As you can see – bright enough, readable but hazy and not perfectly sharp. The reason is quite simple:
“In the special windshield normally used, the transparent plastic safety material sandwiched in between the two pieces of glass must have a slight and very precise wedge, so that the vehicle operator does not see a HUD double image.”
There are some retrofit adhesive film solutions available that claim to help with that. I have not tried any yet. To be honest: to my eye the difference is noticeable but not a deal-breaker.
So I tried the apps that are available. They work. But they do a lot of things differently from how I would have expected or done them. They are bearable, but I think it could be done better.
tl;dr: I started prototyping away and made a list of things that need to be done better than in the existing HUD applications.
Here’s my list of what I want to achieve:
display orientation according to driving direction – I had expected all HUD applications to do this. They know the driving direction. They know how the device is oriented in space. They can tell which direction the windshield is. They know how to correctly turn the screen. They do not do that. None of them.
fonts and numbers – I cannot stand the numbers jumping around when they change up and down
speed step interpolation – GPS only delivers a speed update every second or so. In that time the speed might jump up or down by more than ±1. The display has 60 fps and gyros to play with and interpolate… I want smooth number transitions (see the sketch after this list).
have an “eco-meter” – using the gyros the HUD would be able to display harsh acceleration and braking. Maybe display a color-coded bar where whatever is measured is reflected in the bar going left or right…
speed-limit display – apparently this is a huge issue as far as data availability goes. There seems to be OpenStreetMap data and options to contribute. Maybe that can be added.
have a non-HUD mode – non-mirrored, to use for example to set speed limits and contribute to OpenStreetMap this way!
automatically switch between HUD and non-HUD mode – the device knows its orientation in space, so if you pick it up from the dashboard and look at it, why not switch automatically?
speed zones color coding – change the color of the speed display depending on configurable speed regions. 0-80 is green, 80-130 is yellow, 130-250 is red.
turn the display off when the car has stopped – if there’s nothing displayed or nothing needs to be displayed, for example because the car has stopped, the display can be turned off completely on its own.
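Purely as an illustration of the speed interpolation point above – not code from any of the existing apps – here is a small Python sketch that eases the displayed value towards the latest GPS reading once per frame:

```python
# Illustration of the interpolation idea only.
class SmoothSpeed:
    """Ease the displayed speed towards the latest GPS reading so the
    number changes every frame instead of jumping once per second."""

    def __init__(self, rate_per_second=15.0):
        self.displayed = 0.0            # km/h currently drawn on the HUD
        self.target = 0.0               # latest GPS speed
        self.rate = rate_per_second     # max change per second (assumed tuning value)

    def on_gps_update(self, speed_kmh):
        self.target = speed_kmh

    def tick(self, dt):
        """Call once per rendered frame with dt in seconds (~1/60)."""
        step = self.rate * dt
        diff = self.target - self.displayed
        if abs(diff) <= step:
            self.displayed = self.target
        else:
            self.displayed += step if diff > 0 else -step
        return round(self.displayed)
```

A new GPS fix calls on_gps_update(), the render loop calls tick(1/60) and draws the returned number; the rate constant is just an assumed tuning value.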
Navigation is of limited value, as the only way I could think of to add value would be a serious AR solution that uses the whole windshield. Now, I’ve got these small low-power projectors around… that gets me thinking…
What would you want to have in such a HUD in your car?
I am at the stage of “trying to comprehend” the spoken Japanese language.
I’ll be a happy camper once I understand most of what is being said and can follow normal everyday conversations directed at me in Japanese. Like, you know, when trying to make a purchase or having to ask for that one bit of information.
For this, apart from excessive exposure to the spoken language, I am using some tools to help with reading to a small degree.
For those completely out of the loop:
Japanese has no genetic relationship with Chinese, but it makes extensive use of Chinese characters, or kanji (漢字), in its writing system, and a large portion of its vocabulary is borrowed from Chinese. Along with kanji, the Japanese writing system primarily uses two syllabic scripts, hiragana (ひらがな or 平仮名) and katakana (カタカナ or 片仮名). Latin script is used in a limited fashion, such as for imported acronyms, and the numeral system uses mostly Arabic numerals alongside traditional Chinese numerals.
I’ve written about the progressive web application functionality provided by this blog. But I’ve failed to explain, in all simplicity, what it means for most of you reading.
A week ago I wrote about another mechanical hard drive that was about to bite the dust in our house’s elaborate set-up.
Not having time for a full day of focus, I postponed the upgrade to this Saturday – with the agreement of the family, as they suffer through the maintenance period as well.
The upgrade would need careful preparation in order to be doable in one sitting. And it was also meant to be some sort of disaster recovery drill: I would restore the house’s central Docker and service infrastructure from scratch along the way.
And this would need to happen:
all services, ZFS pools, Docker containers and configurations needed to be double-checked for full backups – as these would be used to restore everything (ZFS snapshots are just the bomb for these things!)
the main central docker server would have to go down
get a fresh Ubuntu 18.04 LTS set up and booting from ZFS on an NVMe SSD (BIOS update(s)!, Secure Boot disabling, AHCI enabling, switching to M.2 instead of SATA Express… you get the idea)
get the network set-up in order: upgrading from Ubuntu 16.04 to 18.04 means ifupdown networking was replaced by netplan. Hurray! Not.
get docker-ce and docker-compose ready and set up and all that funky networking aligned – and figure out along the way that there are currently major issues with IPv6 in Docker.
pull in the small number of still needed mechanical hard disks and import the ZFS pools
start the docker builds from the backup (one script \o/)
start the docker containers in their required order (one script \o/)
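For illustration, the “one script” for the ordered start is essentially a loop over compose projects. A minimal Python sketch of that idea – the paths are made up, and the real script covers the 40+ services and their dependencies:

```python
# Sketch only: bring up docker-compose projects in a fixed order.
import subprocess

STACKS = [
    "/srv/docker/mqtt",        # placeholder paths, not the real layout
    "/srv/docker/databases",
    "/srv/docker/node-red",
    "/srv/docker/dashboards",
]

for stack in STACKS:
    print(f"starting {stack} ...")
    subprocess.run(["docker-compose", "up", "-d"], cwd=stack, check=True)
```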
Apart from some hardware/BIOS related issues and the rather unexpected netplan introduction, everything went fairly well. It just takes ages to watch the data being copied.
Bandwidth was the only real issue with this disaster recovery. All building blocks seemed to fall into place and no unplanned measures had to be taken. The house systems went partially down at around 12:30 and were back up 10 hours later at 22:00. Of course non-automated things like the internet kept working, and all switches fell back to being plain manual push-buttons. So everything could still be done, just with a lot less convenience.
All in all there are more than 40 vital Docker-container-based services that get started one after the other and interconnect to deliver the full house home automation. With the added SSD performance this whole ship is much, much more responsive. And hopefully less prone to mechanical defects.
The backup and disaster preparations proved to be practical and to work well. Not a beat was missed (except sensor measurements during the 10 hours of downtime) and no data was lost.
What could be done better: it would be much more straightforward with fewer dependencies on external repositories / Docker Hub. Almost all issues that came up with containers stemmed from the fact that the maintainers had, just a day before, introduced something that kept the containers from spinning up naturally. Bad luck. But that can be helped! There’s now a multi-page disaster recovery procedure document that will be used and updated in the future.
Oh, and what speeds am I seeing? The promised 3 GByte/s read and write speeds are real. It’s quite impressive to frequently see 4-digit MByte/s values in iotop.
I almost forgot! During this exercise I was in the server room for less than 30 minutes. The rest of the time I spent at a nice and warm work-desk set-up that I use in the house as much as I can – I will tell you about it in another article. The major feature of this work-desk set-up is that it (a) is a standing desk and (b) has a treadmill under it. Yes. A treadmill.
You will get pictures of the set-up in that article, but since I spent more than 10 hours walking on Saturday while doing the disaster recovery, I want to give you a glimpse of what such a set-up means:
This project uses the same approach that I took for my ESP32 based indoor location tracking system (by tracking BLE signal strength). But this project came up with an actual user interface – NICE!
This large number of spinning disks means that there are also failing drives that stop spinning every once in a while. Backblaze saw the need to take note of which hard drive series fail more or less often and started to generate a yearly report on the reliability of these hard drives.
Yesterday they published their report for 2018 – if you have storage requirements or are in the market to purchase storage for your operation, it is probably very helpful to take a look at the report.
We’ve got a couple of SONOS-based multi-room audio zones in our house, and with the newest generation of SONOS speakers you can get Apple AirPlay. Fancy!
But the older generations do not support Apple AirPlay due to their limited hardware. This is too bad.
So once again Docker, open source and reverse engineering come to the rescue.
AirConnect is a small but fancy tool that bridges SONOS and Chromecast to AirPlay effortlessly. Just start it and be done.
It works a treat, and all of a sudden all those SONOS zones become AirPlay devices.
Last year I started to create a calendar that holds all the events and festivals (まつり / matsuri) in Japan – especially Tokyo – that I can get hold of.
Since it has become a custom in my family to spend several weeks several times a year in the Tokyo area this calendar is used and updated frequently.
Of course it is a calendar you can export, import and subscribe to with any iCal / ICS capable device at your disposal. And that probably means any device that has a calendar app or a browser.
Funnily enough I did not take part. And sure enough I am getting all these nudges from services like Twitter… So: ten years ago this blog already existed and I started using Twitter. Apparently I will still use this website, but change my general approach to services like Twitter a bit.
And the next time I’ll explain why I am always awake quite early :-)
I have previously reported on my efforts to develop an indoor location tracking system. Back in 2017, when I started to work on this, I only planned to utilize inexpensive Espressif ESP32 SoCs to look for Bluetooth beacons.
In the meantime I figured that I could, and should, also utilize the multiple digital and analog input/output pins this specific SoC offers. And what better to utilize them with than a range of sensors that can now feed their measurements into an MQTT feed along with the Bluetooth details (a minimal publishing sketch follows the sensor list below).
And there is a whole lot of sensors that I’ve added. On a breadboard it looks like this:
So what do we have here:
Motion sensor
Temperature sensor
Humidity sensor
Light sensor
Barometric pressure sensor
and of course an RGB LED to show a status
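Just to illustrate the sensors-to-MQTT idea, here is a MicroPython-flavoured sketch that reads one of those sensors and publishes it to the broker. This is not the actual firmware (that will be released later, see below); the pin, topic and broker hostname are assumptions, and Wi-Fi setup plus the other sensors are omitted for brevity.

```python
# MicroPython sketch for an ESP32 sensor node; not the project's firmware.
import time

import dht
from machine import Pin
from umqtt.simple import MQTTClient

TOPIC = b"home/office/climate"                  # assumed topic
sensor = dht.DHT22(Pin(4))                      # assumed wiring: data line on GPIO4
client = MQTTClient("esp32-office", "broker.local")  # assumed broker hostname
client.connect()

while True:
    sensor.measure()
    payload = '{"temperature": %.1f, "humidity": %.1f}' % (
        sensor.temperature(), sensor.humidity())
    client.publish(TOPIC, payload.encode())
    time.sleep(30)                              # one reading every 30 seconds
```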
The software is already done, and after 3 weeks of extensive testing it seems to be stable. I will release it eventually, later in the process.
I’ve also found plastic cases that will fit this amount of sensors, rather than the cases I had already bought for the ESP32 alone. For now I’ll close this article with some pictures.
PWA, simply put, is a standardized way to add some context to websites and package them up so that they behave as much as possible like a native mobile application – a mobile application of the kind you are used to installing onto your phone or tablet, most likely using an app store of some sort.
The aim of PWA is to provide a framework and tooling so that the website is able to provide features like push notifications, background updates, offline modes and so on.
Very neat. Just today I enabled the PWA mode of this website, so you’re now free to add it to your home screen. But fear not: you won’t be pestered with push notifications or any background stuff taking place. It’s merely a more convenient, optional shortcut.
I had already added a couple of pictures to my Instagram account – mainly while abroad. Pictures that I consider nice enough to be shared.
Of course my recent switch away from those public silos means those pictures will mainly be posted on this website, and maybe as a side note on those services as well.
To begin with, I will create a separate page that hosts those pictures I consider nice enough to be shared.
A bit of feedback is in on the plan to revitalize this blog. Thanks for that!
I have spent some more time this weekend on getting everything a bit tidied up.
There is the archive of more than 3,000 posts that I plan to review and re-categorize.
There is the big number of comments made in the past, for which I need to come up with a plan on how to allow/disallow/deal with comments and discussions in general on this website.
There are also the design and template aspects of this website. I switched to a different template and started to adjust it so that it makes access to the stream of posts as easy as possible. Until then you need to wait or contact me through other means. But contacting me is another post for another time.
The last Ubuntu kernel update seemingly kicked two hard disks out of a ZFS raidz – sigh. With ZFS on Linux this poses an issue:
Two hard drives that previously were in the ZFS pool named “storagepool” were assigned a completely different device ID by Linux. So /dev/sdd became /dev/sdf and so on.
ZFS uses a specific metadata structure to encode information about each hard drive and its relationship to storage pools. When Linux reassigned names to the hard drives, apparently some things got shaken up in ZFS’s internal structures and mappings.
The solution was these steps:
export the ZFS storage pool (=taking it offline for access/turning it off)
use the zpool functionality “labelclear” to clear off the data partition table of the hard drives that got “unavailable” to the storage pool
import the ZFS storage pool back in (=taking it online for access)
use the replace functionality of zpool to replace the old drive name with the new drive name.
After poking around for about 2 hours, the above strategy made the storage pool start rebuilding (resilvering, in ZFS speak). Well – you learn something every day.
Bonus learning: I was not immediately informed of the DEGRADED state of the storage pool. That needs to change. A simple check is now run by cron every hour.
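The hourly check is basically a wrapper around zpool status -x, which prints “all pools are healthy” when everything is fine. A minimal sketch of such a cron job in Python – the notification command and address are placeholders for whatever channel you use:

```python
#!/usr/bin/env python3
# Hourly cron check: alert when "zpool status -x" reports anything unusual.
import subprocess

status = subprocess.run(["zpool", "status", "-x"],
                        capture_output=True, text=True).stdout.strip()

if status != "all pools are healthy":
    # placeholder notification; swap in mail/MQTT/push as you see fit
    subprocess.run(["mail", "-s", "ZFS pool problem", "admin@example.com"],
                   input=status, text=True)
```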
I am currently in the process of reducing my presence on the usual social networks. Here is my reasoning and how I will do it.
Facebook, Twitter, Instagram and the like are seemingly at the peak of their popularity, and more and more users are getting more and more concerned about how their data and privacy are handled by those social networks. So am I.
Now, my main concern is not so much on the privacy side. I never published anything on a social network – private or public – that I would not want to see published or freely distributed/leaked. But:
I have published content with the intention that it would be accessible to everyone now and in the future. The increasing risk is that those publishing platforms are going to fade away and thus will render the content I had published there inaccessible.
My preferred way of publishing content and making sure that it stays accessible is this website – my personal blog.
I have been doing this since 2004 – the exact year that Facebook was founded. And apparently this website and its content have a good chance of being available longer than the biggest social network of the present time.
So what does this mean? 3 basic implications:
I will become a “lurker” on the social networks. Now and in the future.
All comments and reactions I make will either be given directly in private or be publicly available and linkable through my personal website.
As you can see, this is not about a hard cut or abstinence. I still get information out of social networks and tweet streams. But I do not put any trust in the longevity of either the platforms or the content published there.
The next step for me will be a complete overhaul of this website – getting everything up to current standards to streamline my publishing process.
Expect a lot of content and change – and: welcome to my blog!