Sunday, June 24, 2018

Updates and missing sidebar stuff...

Just a quick post as it's 3:20am and I've been working on this since 8:00pm last night!

Regular visitors will no doubt have noticed that my sidebar widgets have gone. The reason for this is quite simple. They write cookies without asking. As most people will realise, this is a no-no under GDPR.

Blogspot usefully provides a cookie that shows when a visitor agrees to the cookie-usage banner, and I've put together a fairly straightforward listener script that checks for the presence of this cookie. It works fine...

The idea is that where the widgets were, I've put simple placeholder div tags. Once my listener verifies that a visitor has agreed to the use of cookies, it updates the placeholder divs with the third-party scripts and thus loads the Amazon and Twitter feeds.

The problem however is that it's not quite that easy....

Firstly, you can't just fudge it using document.getElementById and innerHTML to modify the div content, because browsers do not execute script tags inserted via innerHTML - a security measure against cross-site scripting attacks. You have to do it properly using the DOM.

This means using something along the lines of

var sc = document.createElement("script");
sc.setAttribute("src", "https://some"); // URL truncated in the original post
sc.setAttribute("type", "text/javascript");
document.getElementById("widget-placeholder").appendChild(sc); // "widget-placeholder" is an example id

which appends a script tag within the div itself.

This pretty much works for most external scripts, and I've used this to good effect on several other sites....
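Putting the pieces together, the whole consent-then-inject flow might look something like the sketch below. To be clear, this is my own illustration: the cookie name displayCookieConsent, the placeholder ids, the example URLs and the one-second poll interval are all assumptions, not Blogspot specifics - check what your own template actually sets when the banner is accepted.

```javascript
// Sketch of the listener: wait for the consent cookie, then inject
// the third-party scripts into their placeholder divs via the DOM.
function hasConsentCookie() {
  // "displayCookieConsent" is an assumed cookie name - verify yours
  return document.cookie.split(";").some(function (c) {
    return c.trim().indexOf("displayCookieConsent=") === 0;
  });
}

function loadWidget(divId, scriptUrl) {
  var sc = document.createElement("script");
  sc.setAttribute("src", scriptUrl);
  sc.setAttribute("type", "text/javascript");
  document.getElementById(divId).appendChild(sc);
}

function watchForConsent() {
  var timer = setInterval(function () {
    if (hasConsentCookie()) {
      clearInterval(timer);
      // placeholder ids and URLs are examples only
      loadWidget("amazon-widget", "https://example.com/amazon-feed.js");
      loadWidget("twitter-widget", "https://example.com/twitter-feed.js");
    }
  }, 1000); // re-check once a second
}

// call watchForConsent() once the page has loaded
```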


With both the Amazon and Twitter feeds, the script being called uses document.write to create content. This is bad for two reasons. Firstly, it's a blocking method: as a synchronous insertion it normally holds up the page load. Worse, though, we have now waited until after the page loaded for the visitor to agree to cookies, activating my listener script and kicking off the createElement insertion - and the page finished loading long ago, so document.write can no longer add to the current div... Doh!

What is needed is an asynchronous method of writing to the placeholder, but of course I have no control over the content of the third party script.

This makes life a lot more complicated!

So, at present I'm looking at using jQuery or an Ajax technique to act as an asynchronous buffer for the third-party script. This should solve the problem of inserting content after the page load is complete, but it adds a lot of complexity.
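For what it's worth, one shape such a buffer could take - purely a sketch of the idea, not tested against the real Amazon or Twitter scripts, and with a function name (loadLegacyScript) of my own invention - is to temporarily override document.write while the third-party script file runs, then flush whatever it wrote into the placeholder div. Libraries such as writeCapture.js follow this general pattern in a much more robust way.

```javascript
// Sketch only: capture document.write output from a legacy third-party
// script and redirect it into a placeholder div after page load.
function loadLegacyScript(divId, scriptUrl) {
  var target = document.getElementById(divId);
  var buffer = [];
  var originalWrite = document.write;

  // while the third-party script executes, collect its writes instead
  document.write = function (html) { buffer.push(html); };

  var sc = document.createElement("script");
  sc.src = scriptUrl;
  sc.onload = function () {
    document.write = originalWrite;      // restore the real method
    target.innerHTML += buffer.join(""); // flush captured markup into the div
  };
  target.appendChild(sc);
}
```

One caveat: this only catches writes made while the loaded file itself executes - if the script document.writes further script tags that in turn call document.write, things get much hairier, which is exactly why the proper libraries exist.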

I will do a full write up when this is finished, as I'm sure there will be a number of bloggers using Blogspot who are in the same position - having third party scripts that write cookies, but that they have no control over.

In the meantime, sidebar widgets will be off and the site may be a little odd while I run these tests.

As the BBC used to say: "Normal service will resume shortly, please do not adjust your sets" :)

More later...

Friday, June 22, 2018

Keeping older hardware alive and useful

If you are anything like me, you will have a few old PCs and laptops that are getting a bit long in the tooth - perhaps only 32-bit architecture - but you can't quite bring yourself to dump them yet, even though a modern Windows install, or even most recent mainstream Linux distros, probably won't boot on them, or would be painfully slow...

... But all is not lost: there are still up-to-date, feature-packed Linux distributions that can breathe new life into your older systems.

Right now I'm creating this blog post using one of my several aging Sony Vaio laptops, which is more than a decade old. The current spec is a 2GHz CPU (Pentium 4 Mobile) with 2GB of RAM. It has a fairly respectable 1600x1200 display and supports a second monitor to further expand the desktop, so it is a useful machine for development work. Its only real limitation is being a P4.

Despite this handicap, this is how my desktop looks:


A full-featured GUI with transparency, shadowing and all the most up-to-date features and software packages found in a Linux distro.

How have I achieved this? BunsenLabs Helium: a lightweight Debian-based Linux distro, released for both 32-bit and 64-bit architectures. It quite happily installs on machines with as little as half a gig of RAM and just a few gigs of disk space. Despite this, it is highly customisable and very quick in use.

A few quick menu-based tweaks gets you the attractive grey theme, edits the on-screen system info (conky) and the top bar (tint2), and adds the background image. I downloaded a mono-grey icon set (as I wanted a largely mono feel to the desktop), which simply has to be placed in /usr/share/icons/, and other than that, it's pretty much stock. You can save all the config changes you make using the BLOB theme manager:

Which makes experimenting with settings, and rolling back to an earlier look and feel when things go astray, really easy.

So does it work...?

Well, yes - I'm editing this using the latest version of Chrome (which also has a dark theme added).

And despite having a nice desktop with animations, blending and shadowing running, along with half a dozen Chrome tabs open, a VNC session to my development server and some music playing, I'm still using only just over half a gig of RAM and very little CPU.

I'm not going to bother writing up a full howto, as there is plenty of info on the BunsenLabs site and in the help forums. In fact, all the info you need is provided within the base install itself, as it has very comprehensive help menus. Being based on the latest stable Debian, there is a wealth of info available as well. You really can't go wrong with this!

Instead, just a few more screenshots:

Using Geany editor / IDE for HTML editing and other coding

Yes, that's the full version of GIMP with all the extras!

And that's a VNC remote desktop session running to my development server (which is also running Helium with only half a gig of RAM!)

As lightweight distros go, I really can't fault this. It has a much smaller footprint than Lubuntu, which was my previous lightweight distro of choice. Blog followers will know I'm not one to be easily swayed in my opinions when it comes to distros, but I'm sold.

If you need a lightweight distro, they don't come much lighter than Helium!

Sunday, April 15, 2018

Using JQuery and the IPData API to serve content based on locale

Have you ever wanted to be able to greet visitors to your blog or project website with a personalised greeting in their own language, or to offer content or advertising links based on the country that a visitor to your site is coming from?

If so, this short tutorial post might well help you along the way with that.

You will need to be able to edit the actual HTML content of your site, or be able to paste in an HTML/JavaScript block (which Blogger supports either as a widget, or as a code edit via the main interface).

The code below is pretty much all you need to get started, though you will obviously need to edit it, both for the country codes and for the actual content you want to serve.

<div id="UK" style="display:none"><h1>UK based content goes here</h1></div>
<div id="US" style="display:none"><h1>US based content goes here</h1></div>
<div id="FR" style="display:none"><h1>FR based content goes here</h1></div>
<div id="other" style="display:none"><h1>other language based content goes here</h1></div>

<!-- load jquery which makes this much easier! (the src URL is missing in the original post) -->
<script src=""></script>

<script type="text/javascript">
// put in an API call to get the users location data (the API URL is missing in the original)
$.getJSON('', function(geodata) {
    // dump the result out to the console log for debug
    console.log(geodata);

    // now we can make some choices (country codes follow ISO 3166-1)
    switch (geodata.country_code) {
      case "GB":
          // country code was == "GB"
          console.log("I say old chap!");
          // toggle the visibility of the GB specific content div
          document.getElementById("UK").style.display = "block";
          break;
      case "US":
          console.log("Howdy partner!");
          // toggle the visibility of the US specific content div
          document.getElementById("US").style.display = "block";
          break;
      case "FR":
          // toggle the visibility of the FR specific content div
          document.getElementById("FR").style.display = "block";
          break;
      // add more cases to suit if needed!
      default:
          // toggle the visibility of the catchall div
          document.getElementById("other").style.display = "block";
    }
});
</script>

Although the above is pretty self-explanatory, I will break down exactly what is happening:

Firstly, create a div for each language block, with its ID set to match an expected two-character country code. The important part is to set the style to display:none, which means that by default they are all hidden. When the script runs and pulls back a country code, the switch block will change the display of one of your divs from none to block, and will thus make just that div visible.

How? Well, next we load the jQuery library. If you are not familiar with it, jQuery makes things like asynchronous (background) calls to other sites and APIs so much easier. I did start out writing this test in pure JavaScript, but quickly decided that the extra overhead of using jQuery is worth it in terms of simplicity - and let's face it, Blogger loads so much other stuff that one more lib isn't going to break anything!

Once jQuery is loaded, we can then use its $.getJSON call to grab the output from the ipdata API.

This grabs the visitor's IP address (and other details) and returns it in a JSON-formatted block.

If you visit the API endpoint in your browser, you will see what the return is:

{
    "ip": "x.x.x.x",
    "city": "London",
    "region": "England",
    "region_code": "ENG",
    "country_name": "United Kingdom",
    "country_code": "GB",
    "continent_name": "Europe",
    "continent_code": "EU",
    "latitude": 51.5142,
    "longitude": -0.0931,
    "asn": "AS60339",
    "organisation": "BT Internet",
    "postal": "EC2V",
    "currency": "GBP",
    "currency_symbol": "\u00a3",
    "calling_code": "44",
    "flag": "",
    "emoji_flag": "\ud83c\uddec\ud83c\udde7",
    "time_zone": "Europe/London",
    "utc_offset": "+0100",
    "is_eu": true,
    "suspicious_factors": {
        "is_tor": false
    }
}
Essentially, this is a list of parameters about your location, based on your public IP address, in JSON format. If you are not familiar with JSON: basically, it is a language-independent data format, much like XML, which is very commonly used in web systems to communicate data between systems via API calls. It has a machine- and human-readable structure, meaning that you can look at the output and understand it, and, more importantly, JavaScript can parse it and extract data.

You get 1,500 free API queries per day, and there is no need to sign up before using the API, so unless your site has a lot of traffic this is pretty much fire and forget. If your site does get a lot of traffic, you might consider writing out a cookie with the visitor's country code when they first arrive, and checking for its presence before calling the API for the geodata, to save queries.
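If you wanted to do that, a minimal sketch might look like this - my own refinement rather than part of the tutorial code, and the cookie name visitor_cc and the 30-day lifetime are just example choices:

```javascript
// Sketch: cache the country code in a cookie so repeat visitors
// don't use up an API query on every page view.
function getCachedCountry() {
  var match = document.cookie.match(/(?:^|;\s*)visitor_cc=([A-Z]{2})/);
  return match ? match[1] : null;
}

function cacheCountry(code) {
  // keep it for 30 days; "visitor_cc" is just an example cookie name
  document.cookie = "visitor_cc=" + code + ";max-age=" + (30 * 24 * 3600) + ";path=/";
}

// usage: only hit the API when there is no cached code
// var cc = getCachedCountry();
// if (cc) { /* show content for cc */ }
// else    { /* call $.getJSON as above, then cacheCountry(geodata.country_code) */ }
```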

(By the way, $.getJSON('', is not a typo - if you are looking for the closing bracket, it's down at the bottom of the script.)

You are probably wondering how we get the return from that?

Well, the callback on the next line - function(geodata) { - collects the return from the API call and assigns the raw JSON object to the variable geodata.

We dump that out to console.log for debugging purposes. Your browser will usually let you inspect the page and view the console log as part of its developer tools, but that is beyond the scope of this tutorial. If you need more info about this, Google is your friend :)

OK, so we assume at this point (if you are an advanced user, you might want to trap some errors and make no assumptions!) that geodata is now a JSON-formatted object. This means we can use simple object notation to get at the info we need - i.e. the country code.

And to do this we assume that geodata.country_code will contain the two-character international country code conforming to ISO 3166-1, and thus we can make a simple switch decision tree using:

switch(geodata.country_code) {

and provide a case for any countries we want to serve specific content for using:

case "GB":
        //country code was == to "GB"
        console.log("I say old chap!");
        // toggle the visibility of the GB specific content div
        document.getElementById("UK").style.display="block";
        break;

for each country code. Where the case statement matches the country code, the console log simply records a suitable remark to identify the country.

The important part is:

document.getElementById("UK").style.display="block";

This changes the visibility of one of the divs lower down to block, meaning it suddenly becomes visible, whereas all its counterparts remain hidden.

If none of the case statements find a match, then the default scenario is launched by making the div with the "other" id visible. This allows you to have a catchall greeting/block/advert in place for visitors from parts of the world you just didn't figure would pay your site a visit.

And that's about it as far as the JavaScript goes.

That all sounds complicated, but it really isn't.

You can experiment with any of the key/value pairs (e.g. currency and GBP) in the same way, but beware of the actual lat/lon info reported - I'm certainly not in London right now, though my ISP's core switch is. If you need finer-grained geolocation, you can look at the HTML5 geolocation system, but be aware that, unlike the script above which is silent, asking for higher-resolution location info via HTML5 will bring up a permission prompt in the browser.
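By way of comparison, the HTML5 route only takes a few lines. This is a sketch of my own (the wrapper function name is mine); remember the browser will ask the visitor's permission before the callback ever fires:

```javascript
// Sketch: ask the browser for high-resolution coordinates.
// Unlike the IP lookup, this triggers a permission prompt.
function getHtml5Position(onCoords) {
  navigator.geolocation.getCurrentPosition(function (pos) {
    onCoords(pos.coords.latitude, pos.coords.longitude);
  });
}

// usage: getHtml5Position(function (lat, lon) { console.log(lat, lon); });
```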

I hope that's a help in getting started with your location-specific greetings. I will leave you with the request "do no evil". This script is not meant to discriminate, but to make sites more inclusive. I have travelled through many countries in the world now, and been welcomed in all. The sooner we realise we are all citizens of Earth, and that artificially created boundaries and differences are meaningless, the better for all.

Wednesday, September 18, 2013

A Nostradamus moment?

Just a couple of days back I wrote a blog post entitled MOOCs?, in which I also drew reference to a blog post from a couple of years ago where I discussed my worries that the current ICT curriculum in UK schools is not a substitute for learning computer science: ICT is not CS! That now seems almost prophetic, as two directly related announcements cropped up on the web yesterday.

The first was spotted on a website dedicated to the development of software, games and interactive media, highlighting the government's outline for the new Computer Science course which is to replace ICT. This very much echoes my thoughts, and hopefully addresses the shortcomings of ICT as it stands. The new curriculum aims to deliver a structured introduction to computer science starting at KS1, adding new concepts and building upon principles through to KS4.

           "The government has outlined the new curriculum for computer science in schools.

A program has been drawn up for Key stages one through four, which will mean students will begin learning computer science from early primary school and then throughout their education.

The statutory guidance states the aim of the national curriculum for computing is to ensure students can understand and apply the fundamental principles and concepts of computer science, including abstraction, logic, algorithms and data representation.

Pupils will also be taught to analyse problems in computational terms with practical experience of writing computer programs to solve them, to evaluate and apply information technology analytically to solve problems, and also ensure pupils are responsible, competent, confident and creative users of such information and communication technology.

The publication stated that such a “high quality education equips pupils to use computational thinking and creativity to understand and change the world”.
The new GCSE subject in computing will replace ICT from September 2014."

Reading through the key stage subject contents listed, this certainly looks like an improvement on the existing ICT syllabuses and will be welcomed I feel by both pupils and some educators.... 

....But there lies one of the problems...

With a course of this type, the educator - the person delivering the material - will really make or break the learning experience that children have at all levels. The GCSE Computer Science syllabus really needs someone a little "geeky" to deliver it well: someone with a good understanding of the concepts, an active interest in the area, general teaching skills, and the enthusiasm for the subject that can be passed on to the pupils. I have no doubt that some - possibly many - existing ICT teachers will be more than capable of meeting the demands... but I am equally sure that many may struggle. Which brings me round to my point: how can these complex subjects be brought home to pupils when the teachers may not be fully up to speed themselves? MOOCs may be the answer - high-quality interactive online learning material produced by the examining entities themselves.

...And this isn't perhaps out of the question...

The second announcement yesterday was that Cambridge University Press's GCSE Computing MOOC goes live at the end of the month (September 30th). This is an OCR-accredited GCSE in Computing course that's available and supported online right now, based around learning using the Raspberry Pi platform.

             "It's a GCSE, but not as you know it... Computing rules the world, or at least a large part of it. Cambridge GCSE Computing Online will provide free and open access to OCR’s GCSE in Computing, supported by resources from the Raspberry Pi Foundation and Cambridge University Press. Together we’re busy creating a ground-breaking site to help you make sense of the technologies and opportunities this amazing vehicle offers in industry, education and every aspect of our daily lives. Through a mixture of videos, animations and interactive exercises, the content is being designed to challenge and inspire you. We know that studying Computing is about using creativity and problem-solving to unlock opportunities all around you, inside the classroom and far beyond it."

This will be freely available (free as in beer, to use the Linux analogy) to schools to help deliver this qualification, and will provide a valuable resource. More crucially, it is also available to students already in education who worry that they have been disadvantaged by the current ICT curriculum. But they would have to make their own arrangements to sit the exam - and it is important to stress that:

             "No OCR GCSE Computing certificate will be available direct through the Cambridge GCSE Computing Online website but the content will help students prepare for the exam." 

So individuals interested in this qualification would need to see if a local school or college was able to host the final examination. I'm sure that when this takes off, a number of local groups will emerge; given the Raspberry Pi involvement, it will be a hot topic in local hackerspaces and groups, attracting interest from enthusiasts. So finding enough people in a geographic area to make it worthwhile putting on an exam, or travelling to where one is running, should not be out of the question.

In my opinion this has been a looooong time coming, and will be too late for some, but at least now there is light at the end of the tunnel for those students who are frustrated in their efforts to learn "Computing" rather than having "ICT" thrust upon them.

Sunday, September 15, 2013


MOOCs?

Nearly two years back I wrote a blog post entitled ICT is not CS!, about some of the reasons why I feel that the UK (and some other countries too) is falling behind with education in computer science. The article caused discussion in some circles, and has indirectly led to several projects where others with similar worries have stepped in to work with schools in their own areas. Thanks to everyone who gave feedback, and especially to those who went on to do something positive.

Thankfully the situation in the UK is improving now, despite resistance and scepticism from some. A number of initiatives that have featured in the national press seem to be slowly filtering into place. There are also a growing number of technology-education events throughout the country, though some of them take a little finding unless you follow the right news feeds on Twitter and Facebook. But they are out there!

Sadly, though, it's still not true throughout the world. There are still so many places where science and technology education is scarce or non-existent... But thanks to the widespread availability of the internet, this may not always be the case. Even where there is no local source of science and technology education, there has been, for a year or so now, a wide range of online education systems developed by some of the world's leading universities, delivered and graded for free simply to benefit the world of education. I refer of course to MOOCs (Massive Open Online Courses).

Today I read this:  which is what has prompted this post... (credit to Adafruit for the original tweet which drew it to my attention)

It describes the achievements of Battushig Myanganbayar from Mongolia, a country where "a third of the population is nomadic, living in round white felt tents called gers on the vast steppe". At the age of 15 he "became one of 340 students out of 150,000 to earn a perfect score in Circuits and Electronics, a sophomore-level class at M.I.T. and the first Massive Open Online Course, or MOOC — a college course filmed and broadcast free or nearly free to anyone with an Internet connection — offered by the university." The article goes on to describe how "Battushig’s success also showed that schools could use MOOCs to find exceptional students all over the globe. After the course, Kim and Zurgaanjin suggested that Battushig apply to M.I.T., and he has just started his freshman year — one of 88 international students in a freshman class of 1,116. Stuart Schmill, the dean of admissions, said Battushig’s perfect score proved that he could handle the work."

There can therefore be little doubt about the value of these courses, not only in developing countries but the world over.

This has, however, raised some interesting questions about the future of education in some circles... Does this spell the beginning of the end of traditional education, as some have suggested? Is it a "fad"? Somehow I doubt it... We have had the Open University here in the UK for a number of years (albeit not for free), yet we still see record attendance figures at our colleges and universities. I should add that had it not been for the OU TV programmes on the BBC back in the 70s and 80s, I almost certainly would not have the interest in computing, science and technology I have now. I tend to view MOOCs in the same way: as a source of inspiration for the scientists and engineers of tomorrow.

A handful of UK colleges and universities are now offering a limited range of MOOCs, but I think it's time a lot more looked towards producing their own, not only as a public service but as a way of finding the brightest and best students who might otherwise never have found a way to their doors. And more importantly, aiming them not only at school leavers but at those in secondary education too... especially if those courses had national accreditation comparable to the courses currently offered in the national curriculum.

It really is time everyone started making education cheaper, more accessible and more meaningful to the future aspirations of our children - MOOCs may well be the means to do that.


Monday, September 02, 2013

Time to look at a balloon project again?

Just a quick blog post from my phone, more as a "note to self" for a future project than a full write-up, but perhaps it may also serve as a starting point for others who follow my blog to begin their own research...

I've looked at and dismissed a high-altitude balloon project several times in the past, for a number of reasons - not least the costs involved: the latex balloon, the large volume of helium required to fill it, and the risk of losing an expensive payload all make for an expensive flight.

However, a few news posts and blog entries on the web recently have found their way to my attention, suggesting that there is a lower-cost entry route to this fascinating area of research.

I refer of course to "pico balloon projects".

Unlike the more frequently reported "high altitude" flights, which carry complex payloads - usually equipped with cameras - to "near space" heights of around 30km using a meteorological balloon and parachute recovery system, pico balloon projects carry a very lightweight payload, typically just a few tens of grams, lifted by inexpensive foil "party" balloons (£3.95!!). They reach a maximum altitude of just a few km, but under ideal conditions can make lengthy flights.

The payload weight restriction is governed by the very limited lift available from the balloon itself, but should not be viewed too negatively: with a little ingenuity, a lot of technology can be packed into a 50g mass. It also means that the overall cost of the payload is lower - an advantage if recovery is uncertain.

From the little research I've done, it seems there are some "off the shelf" boards available (the PCB designs and firmware are open source). Personally, I would probably want to develop my own flight computer - probably using an Arduino as the test platform, then building a cut-down version using just the required components (which looks to be the way others have gone too).

Below are a few quick links to get you started...

edit (I forgot this one!) : (hardware)  (balloons)  (general info - not pico specific)

Friday, October 26, 2012


I had a spare hour or so today while I waited for a server to finish updating, so took the opportunity to assemble the Rockmite 20 which arrived from Small Wonder Labs in the USA yesterday.

Compliments to David Benson for this great little TRX. I can confirm it worked at first power-up, delivering around 250mW into a 50Ω load. Signals heard, but I've not attempted to work anyone yet.

Now for the next phase: a 12v lipo battery pack, and casing it up with a built in paddle key in the smallest possible box. 

I'm aiming for something like this:  but with a built-in 2Ah battery and an external charge socket.

Ultra-portable QRP on 20m :)