Raspberry Pi as syslog server

I’ve been meaning for some time to add a Raspberry Pi to my lab environment as a syslog server and finally got around to it today.

I have a couple of Pis, but when I went looking this morning I realized that I had a problem with the SD card on one, which led to a small voyage of discovery.

To re-flash the SD card I chose the Raspbian Stretch Lite download, here.

And then re-flashed the card using balenaEtcher per instructions, here.

I operate the Pis in headless mode on a wired network, and in order to get them to come up with SSH enabled I had to add an empty file, ssh.txt, to the boot partition on the SD card (the official docs describe a file simply named ssh, with no extension, but ssh.txt worked here).

Once online I used a network scanner to find the DHCP-served address, as sketched below, and then set a static address by editing dhcpcd's configuration file.
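A ping scan with nmap does the job (nmap assumed installed, and the subnet below is just an example; run with sudo so the MAC vendor shows up in the results):

sudo nmap -sn 192.168.1.0/24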

sudo nano /etc/dhcpcd.conf

interface eth0
static ip_address=192.168.1.2/24
static routers=192.168.1.1
static domain_name_servers=192.168.1.1
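A reboot, or a restart of the dhcpcd service, picks up the new address:

sudo systemctl restart dhcpcd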

More on the details of the syslog server itself in a future post.
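(As a preview, and assuming the stock rsyslog that ships with Raspbian: remote reception is enabled by uncommenting the UDP input lines in /etc/rsyslog.conf and restarting the service.)

module(load="imudp")
input(type="imudp" port="514")

sudo systemctl restart rsyslog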

Amazon Alexa

I’ve been a dedicated Google Home user for some time, but I wanted to see the other side, so when the Echo Dot went on sale a few weeks ago I picked one up.

As with Google Home one of the first things I wanted to find, and enable, was an audible acknowledgment. The trigger word (phrase) doesn’t always get a response, depending on local conditions, and I’d rather know that I’ve been heard before talking into the ether.

Instructions for doing so can be found here:

How to Make Your Amazon Echo Play a Sound When You Say “Alexa”

As regards Alexa, the jury is still out, but on balance Google Home seems to be a much more capable agent in terms of recognizing the intent of my queries and directions.

For the most part, however, I use both platforms for music selection. Google Play Music seems to have the more extensive music library, although I don’t know that for a fact. I was disappointed to find that Alexa couldn’t query my local music library, indexed via Sonos, but since Google Home support has yet to be ported to Sonos I’m no further behind. I use a Chromecast Audio plugged into my Play:5’s audio port to drive my Sonos environment with Google Home.

I also recently installed three Google Home Minis and an Onkyo amplifier with embedded Chromecast audio for a client. I’m impressed with the latest revisions to the Google Home app and how it manages multi-user, multi-home environments. More on that later.

Update: The missing piece of the puzzle. On Amazon Music, as a Prime member, I get access to “two million songs ad-free and on-demand, while Amazon Music Unlimited ($7.99/month) expands the library to tens of millions of songs and lets you download them for offline listening on any device.”

Artificial Intelligence/Machine Learning Articles

Artificial Intelligence Links

I took @AndrewYNG’s Machine Learning course in the winter of 2015, and I’ve retained a keen interest, since then, in all things ML and AI. I’ve decided to create this post so I have a place to aggregate interesting links to related articles. I’ll backfill as I’m able.

Nautilus – Why Robot Brains Need Symbols, Dec 6, 2018

AlphaZero: Shedding new light on the grand games of chess, shogi and Go

NYTimes – The Human Brain is a Time Traveler  Nov 15, 2018

The Genius Neuroscientist Who Might Hold The Key to True AI  Nov 13, 2018

Apple and Its Rivals Bet Their Futures on These Men’s Dreams  May 17, 2018

The Downsides to Deep Learning  Feb 2, 2018

Meet the Man Google Hired to Make AI a Reality  Jan 16, 2014

Using MS SQL Server with PHP on Linux

I’m a big fan of Microsoft’s SQL Server database, but I also do a lot of coding in PHP, and Linux is my favourite platform for that. So, how best to mesh those two worlds?

The best step-by-step instruction guide that I have found is the one below:

http://www.unixodbc.org/doc/FreeTDS.html

I’ve used this instruction set on numerous occasions, and it is still possible to get yourself into an unworkable state; however, persistence and attention to detail always pay off.
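Once the ODBC layer is in place, a quick PHP-side smoke test looks something like the sketch below. The DSN name and credentials are placeholders, and it assumes the pdo_odbc extension is loaded against unixODBC:

<?php
// Assumes a DSN named "mssql" in /etc/odbc.ini, backed by a FreeTDS entry in freetds.conf.
$dsn  = 'odbc:mssql';
$user = 'sa';
$pass = 'YourPasswordHere';

try {
    $db = new PDO($dsn, $user, $pass);
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $row = $db->query('SELECT @@VERSION AS v')->fetch(PDO::FETCH_ASSOC);
    echo $row['v'] . PHP_EOL;   // prints the SQL Server version string
} catch (PDOException $e) {
    echo 'Connection failed: ' . $e->getMessage() . PHP_EOL;
}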

Note: I have run MS SQL Server on Linux since the Linux preview was released in 2016.

Microsoft and FIDO2 support

A couple of interesting articles about Microsoft and FIDO2, out yesterday and today.

The Verge – You can now sign into a Microsoft Account without a password using a security key

The Register – Microsoft: You looking at me funny? Oh, you just want to sign in

I’m not sure how I feel about this yet, but will continue to monitor and post feedback here.

I’ve used a YubiKey to secure my Gmail account for several years now, and I also use one to secure my password vault (more on that later).

As the Register article indicates, it could become a challenge for large organizations when users start losing their keys. To be continued …

Audio Chromecast

I’ve been a big fan of Google’s Chromecast since the beginning, and when they released their Chromecast Audio product I, of course, had to take a look. We now own three of them: one connected to the audio port on our Sonos Play:5, and two on Bluetooth speakers, each with its own audio port.

Using Google’s Home app you can easily create groups of speakers to cast to. The Sonos environment can now be grouped in with non-Sonos speakers, and within Sonos the Play:5 audio port can be grouped with other Sonos speakers, such as the Play:1.

And then, of course, we got a Google Home Mini this year, so we can now verbally request music to be played to both the Sonos and non-Sonos environments. We subscribe to Google Play Music, and a nice feature here is that you can ask Google Home to play a particular track and it will automatically put together a playlist of related tracks and carry on indefinitely. A must for music lovers.

Ping with timestamp

During a troubleshooting exercise a couple of weeks ago there was a need to log the pings from two different sources for later comparison. Of course this meant having a timestamp associated with each.

A quick search of Stackoverflow turned up the following thread:

Ping with timestamp

I think my favoured solution is the one that uses PowerShell, as per below.

ping.exe -t twitter.com | Foreach {"{0} - {1}" -f (Get-Date),$_} > test.txt

In a second PowerShell window you can also tail the output, thusly:

Get-Content filenamehere -Wait -Tail 30
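With the earlier command writing to test.txt, that becomes:

Get-Content .\test.txt -Wait -Tail 30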

The Untold Story of NotPetya, the Most Devastating Cyberattack in History

I finally got around to reading the excellent Wired article on NotPetya [Aug 2018], The Untold Story of NotPetya, the Most Devastating Cyberattack in History.

One of the items that stood out for me, relative to Maersk’s infrastructure, was this one:

Early in the operation, the IT staffers rebuilding Maersk’s network came to a sickening realization. They had located backups of almost all of Maersk’s individual servers, dating from between three and seven days prior to NotPetya’s onset. But no one could find a backup for one crucial layer of the company’s network: its domain controllers, the servers that function as a detailed map of Maersk’s network and set the basic rules that determine which users are allowed access to which systems.

Maersk’s 150 or so domain controllers were programmed to sync their data with one another, so that, in theory, any of them could function as a backup for all the others. But that decentralized backup strategy hadn’t accounted for one scenario: where every domain controller is wiped simultaneously. “If we can’t recover our domain controllers,” a Maersk IT staffer remembers thinking, “we can’t recover anything.”

I can remember having this conversation with Microsoft in 2003 — how do we backup and recover Active Directory? They looked at me like I had two heads. “Why would you want to do that, it’s replicated to multiple DCs?”

Over time, that view has obviously changed. A couple of links for further reading:

AD DS Backup and Recovery Step-by-Step Guide
AD Reading: Active Directory Backup and Disaster Recovery
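For reference, on a domain controller with the Windows Server Backup feature installed, a system state backup (which includes AD DS) can be taken from an elevated prompt along these lines, with the target volume being whatever you have available:

wbadmin start systemstatebackup -backupTarget:E: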

The article took me back to the days of Melissa, Code Red, Nimda, and SQL Slammer, although those now seem like much more innocent times.


Create a table dynamically in MS SQL

I’m currently working on a project that requires reading an MS Excel spreadsheet into a database table, so I needed a stored procedure in MS SQL to create a table dynamically, given the database name, table name, and number of columns as inputs. I’m re-using some prior work.

Code below:

USE [database name]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Create date: 2016-01-05
-- Description:	Create Element Tables.
-- =============================================
CREATE PROCEDURE [dbo].[CreateElementTables]
 @Database varchar(255),
 @TableName varchar(255),
 @NumColumns int
AS
BEGIN
	SET NOCOUNT ON;

    DECLARE @sqltext varchar(max)
	DECLARE @x int = 0

	SET @sqltext = 'CREATE TABLE ' + @Database + '.[dbo].' + @TableName + '([RowNumber][int] IDENTITY(1,1) PRIMARY KEY,'

	WHILE @x < @NumColumns
	BEGIN
	 SET @x = @x + 1
	 SET @sqltext = @sqltext + '[C' + cast(@x as varchar(2)) + '] [varchar](max) NULL,'
	END

	SET @sqltext = @sqltext + ') ON [PRIMARY]'
	SET @sqltext = REPLACE(@sqltext,'NULL,)','NULL)')

	print @sqltext
	exec(@sqltext)
END
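A call with made-up names would look like this; the print statement echoes the generated DDL before exec runs it:

EXEC [dbo].[CreateElementTables]
     @Database   = 'StagingDB',
     @TableName  = 'Sheet1Import',
     @NumColumns = 5

-- Produces: CREATE TABLE StagingDB.[dbo].Sheet1Import with a RowNumber identity
-- primary key plus columns [C1] through [C5], all varchar(max) NULL.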

Send email via PHP

For a number of years I’ve used @adamGreenPress’s Twitter engagement framework for interacting with my Twitter community. I’ve found it useful for generating a summary of the tweets that have been sent while I’ve been away from the platform. It’s based on PHP on the front end, running on CentOS/Apache, and the data store is MySQL.

I use a PHP script, running on a schedule, to send summaries to my Gmail account.

The script is as follows:

require('config.php');
require('db_lib.php');
$db = new db();

$query = "select * from engagement.tweets where user_id in (select user_id from engagement.leaders where screen_name like '%nvestor%') order by created_at desc LIMIT 20";
$result = $db->select_array($query);

$email = 'example@domain.com';
$to = $email; // recipient address
$subject = "Globe Investor";

$message = "<html>";
$message = $message."<table>";
$message = $message."<tr><td>";
foreach($result as $r) {
  $message = $message."<tr>".$r['tweet_text'].",[".$r['created_at']."]"."</tr>";
}
$message = $message."</table>";
$message = $message."</html>";

$from = "example@domain.com";
// To send HTML mail, the Content-type header must be set
$headers  = 'MIME-Version: 1.0' . "\r\n";
$headers .= 'Content-type: text/html; charset=iso-8859-1' . "\r\n";
// Additional headers
$headers .= 'From: Team <example@domain.com>' . "\r\n";

if(mail($to,$subject,$message,$headers))
{
        echo "0"; // mail accepted for delivery
}
else
{
        echo "1"; // mail() failed
}

The connection string for the database is in db_lib.php, and I issue the select statement to retrieve the desired tweets (in this case from @GlobeInvestor) from the database at line 5.

To send to Gmail I use msmtp, configured per How To Use Gmail or Yahoo with PHP mail() Function at DigitalOcean.
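For completeness, the relevant pieces end up looking roughly like the sketch below; the account details and paths are placeholders, and a system-wide /etc/msmtprc works the same way as a per-user file:

# ~/.msmtprc -- placeholders throughout
defaults
auth           on
tls            on
tls_trust_file /etc/ssl/certs/ca-bundle.crt

account        gmail
host           smtp.gmail.com
port           587
from           example@gmail.com
user           example@gmail.com
password       your-app-password

account default : gmail

And in php.ini, point mail() at msmtp:

sendmail_path = "/usr/bin/msmtp -t"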