<![CDATA[Chris Brown]]>https://blog.zirafon.org/https://blog.zirafon.org/favicon.pngChris Brownhttps://blog.zirafon.org/Ghost 2.14Mon, 18 Feb 2019 23:16:14 GMT60<![CDATA[AWS Summit 2018]]>https://blog.zirafon.org/aws-summit-2018/5b8c34071c30e311fc2499c5Fri, 11 May 2018 08:40:40 GMT

As Gavin Jackson, AWS managing director for UK and Ireland, said in his keynote, "The Summit is AWS's chance to meet and thank the builders that use the platform in person".

"Builders" was later defined as pretty much anyone from a data scientist to a marketing manager, but for me the focus was more on the developers: showcasing the tools AWS provides and how to use them.

The event is often used to announce new products, but there was nothing of interest this year. Although, hilariously, Gavin delivered the "today we're happy to announce" line and presented a tool that streamlines enterprise software contracts, which passed by without the expected applause.

Highlights

I've been using AWS intensively for a number of years now, so there weren't an awful lot of sessions and products that were new to me. However, these are my headlines from the Summit:

  • We still have a major gender imbalance in our industry
  • Serverless is what most people are interested in
  • There is a huge arms race in the big data space
  • Despite EKS still being in preview, EKS and Fargate look to be a winning combination

Any gathering of people in our industry usually demonstrates the huge gender disparity we have, but I felt the Summit had a larger imbalance than any conference I've been to before. The picture below was one of many taken that sadly demonstrates this.

AWS Summit 2018

Serverless popularity

Despite it being around for a number of years now, any breakout session that had serverless in the title was always standing room only. Examples include how to build an SPA with CloudFormation, and how to authenticate serverless products using Cognito and IAM roles.

AWS Summit 2018

The most interesting serverless session for me was David Edwards, Solutions Architect at River Island, demonstrating how they moved their core order flow from a platform that was difficult to innovate on to an entirely serverless solution, mainly using a combination of AWS Kinesis, Lambda and Step Functions.

AWS Summit 2018

Giorgio Bonfiglio put together a great slide deck detailing the infrastructure required to go from 1 user all the way to 10m+. He ended up with an event-driven architecture using Kinesis and Lambdas, whilst using ECS, S3 and CloudFront for content delivery.
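That Kinesis-and-Lambda event-driven pattern can be sketched in a few lines. This is my own minimal illustration (the order payload and its `orderId` field are invented), not code from either talk:

```javascript
// Minimal sketch of a Lambda handler consuming events from a Kinesis
// stream. The event shape follows the standard Kinesis-to-Lambda
// integration; the order payload itself is a made-up example.
const handler = async (event) => {
  const orders = event.Records.map((record) => {
    // Kinesis delivers each record payload base64 encoded
    const json = Buffer.from(record.kinesis.data, 'base64').toString('utf8');
    return JSON.parse(json);
  });

  for (const order of orders) {
    // In a real flow this might kick off a Step Functions execution
    console.log('processing order', order.orderId);
  }

  return { processed: orders.length };
};

exports.handler = handler;
```

The appeal of the pattern is that scaling, retries and batching are handled by the platform; the handler only ever sees a batch of records.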

Big data is big business

Walking around the exhibition hall whilst T-shirt, sock and fidget spinner shopping, it was apparent how many players are in the big data market. I chatted to a few of them and they all have their own ideas, some even saying data warehousing is dead, although Mike Ferguson, managing director of Intelligent Business Strategies, managed to convince me otherwise.

Although AWS has a clear market lead over the other cloud providers, none of them has a product that has taken any sort of ascendancy in the big data space. But I very much feel this race has only just begun.

That said, I love the simplicity that the combination of S3, Athena and Glue brings to the table. Given the difficulty of finding data scientists with Python experience, this for me is really compelling.

Fargate

The Summit was the first time I had seen AWS Fargate in action, as Abby Fuller showed us how Fargate can be used to manage the deployment of your containers, meaning there is even less to manage. I think this should be a feature by default for EKS (if it ever comes out of preview), and if you want more control then revert to the EC2 launch type to define your specific AMI requirements.

The app

The app had its flaws, as the reviews on the Play Store prove. However, a feature I would love to see next year is a QR scanner built into the app. Each person had a code around their neck and it would have been a great alternative to swapping business cards.

Summit in Summary

The Summit is free and focused on presenting the latest AWS products to developers. It's really well attended by AWS partners and a great place to meet interesting people trying to solve interesting problems. If you can make it next year, I thoroughly recommend it.

Links

https://www.twitch.tv/aws/videos/all
https://aws.amazon.com/summits/london/
https://github.com/richarvey/bl_docker_to_production_ecs
#AWSSummit

]]>
<![CDATA[HTTPS on github pages with a custom domain]]>
https://blog.zirafon.org/https-on-github-pages/5b8c34071c30e311fc2499c4Mon, 30 Apr 2018 21:25:47 GMTSince June 2016, setting up HTTPS on GitHub Pages has been easy. However, custom domains have never been supported. Here is how I resolved this problem.

Encryption is becoming the standard for the entire web, for many obvious reasons, to the point that all browsers will require HTTP/2 requests to be encrypted and will flag non-secure sites with a (i) warning symbol.

Personally, I wanted a nice green padlock, and to take advantage of the performance improvements of HTTP/2 multiplexing and the service worker support that requires being served over HTTPS. For more information on HTTP/2, I really enjoyed Ana Balica's talk at NDC this year; go check it out here.

How I did it

  1. Make sure you have a registered domain name and have set it up appropriately in your github page repo.
  2. Sign up for CloudFlare and create an account for your domain. Instructions to how to do that are here.
  3. Determine your domain registrar and login to your registrar account.
  4. Update your nameserver records to the Cloudflare nameservers.
  5. From the CloudFlare settings for that domain, enable HTTPS/SSL or set SSL to Full depending upon the version of the portal you are using.
  6. That's it! yoursite.github.io should now redirect to a secure yoursite dot com.

Cloudflare offers loads of great features as a content delivery network, and it's free! Please check it out here.

Thanks for reading, any questions please ask below.

]]>
<![CDATA[Out with the old]]>
https://blog.zirafon.org/out-with-the-old/5b8c34071c30e311fc2499c3Mon, 30 Apr 2018 10:48:48 GMTI've previously blogged about how I set up my ghost blog, but today I have finally updated it from 0.8 to v1, specifically v1.22.4.

My previous site, which can be viewed here, used my custom ghost theme, but I MUCH prefer the new casper theme, so for now I'm going to stick with it.

]]>
<![CDATA[Working effectively with log files]]>
https://blog.zirafon.org/working-effectively-with-log-files/5b8c34071c30e311fc2499c2Tue, 21 Feb 2017 22:57:14 GMTWorking with logs in production is made really easy with solutions like the ELK stack, but working with logs locally can be really cumbersome.

For example, something like the configuration below is very typical for local development;

<?xml version="1.0" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

    <targets>
        <target name="file" xsi:type="File"
            layout="${longdate} ${logger} ${message}" 
            fileName="${basedir}/logs/logfile.log" 
            keepFileOpen="false"
            encoding="iso-8859-2" />
    </targets>

    <rules>
        <logger name="*" minlevel="Debug" writeTo="file" />
    </rules>
</nlog>

Typically we double click on the file, close, refresh and repeat.

Well, below are a couple of really simple but effective solutions for working with log files.

1. Windows explorer preview pane

  1. Open windows explorer.
  2. View -> Preview Pane
  3. Select the log file.

The great thing is the preview pane refreshes the output every time the file changes. If the preview pane doesn't show you the log file content, it's typically because the file extension isn't mapped to a content type.

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\.log]
@="txtfile"
"Content Type"="text/plain"
"PerceivedType"="text"

Simply copy the above and save the file using the .reg file extension. Double click it and the key will be added to the registry. Try selecting the file again.

The only problem is the preview pane doesn't show the tail. But that's fine, as typically in local development I archive and overwrite the log. Obviously, in some cases this isn't ideal.

2. Bash console

$ tail -n 0 -f logfile.log

logfile tail example

3. Notepad++

There are other text editors, but Notepad++ works best for me. However, out of the box, after a change to the log file you'll be presented with the following prompt.

reload notepad++

To enable silent reload of the log file, go to Settings -> Preferences -> MISC -> Update silently.

NB. Notepad++ v6.9.2 introduces the tail command, which scrolls to the EOF.

Click on the eye icon on the toolbar or use the menu View -> Monitoring (tail -f) to activate/deactivate this command


An honorable mention to the otroslogviewer project, but this is more of a power-user tool and I have little use for it in local development.

Thanks for reading. Please share your thoughts and alternative approaches.

]]>
<![CDATA[Introduction to Web Components]]>
https://blog.zirafon.org/introduction-to-web-components/5b8c34071c30e311fc2499c1Thu, 09 Feb 2017 21:12:31 GMTWeb components were first introduced, I believe, in 2011 by Alex Russell. The idea was clear and remains so today: create reusable UI widgets that can be shared across web applications.

This is the second time I've looked at web components as a technical solution to a problem. The first was in 2012, and the reasons for not adopting them then were the lack of support and better alternative approaches.

Five years on and it's slow progress. Google has shown great support, so much so that they built the https://patents.google.com/ site using web components, and Chrome is currently the only browser with full native support.

Another example is the relative time caption on everyone's favorite site, GitHub.

relative-time web component

This is used throughout the site to localise the given date depending upon the user's browser settings. Notably, if the user has JavaScript disabled it will fall back to the value set in the relative-time custom element.

<relative-time datetime="2017-02-05T06:26:26Z" title="Feb 5, 2017, 6:26 AM GMT">9 days ago</relative-time>

Clarification and avoiding confusion

Web components are often confused with custom elements; however, as noted in the W3C standards specification, custom elements are actually just one of four technologies used in the makeup of a web component. Collectively they include:

Custom Elements

<my-navigation>hello world</my-navigation>

HTML Templates

<template id="navigation">
  ...
  <li>home</li>
  ...
</template>

Shadow DOM

this.createShadowRoot();

HTML Imports

<link rel="import" href="navigation.html?v=1" async />

Now put it all together and you have a native, reusable UI component.
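To make that concrete, here's a minimal sketch of a custom element with a shadow root. This is my own illustration (the my-greeting element and its behaviour are invented), with the markup logic pulled into a pure function so it's testable outside a browser:

```javascript
// Pure render function: produces the component's shadow markup.
// Keeping it separate from the DOM wiring makes it testable anywhere.
function renderGreeting(name) {
  return '<style> p { color: red; } </style><p>Hello ' + name + '</p>';
}

// Register the custom element only where the DOM APIs exist.
if (typeof window !== 'undefined' && window.customElements) {
  class MyGreeting extends HTMLElement {
    connectedCallback() {
      // attachShadow is the current spec; older drafts used createShadowRoot
      const root = this.attachShadow({ mode: 'open' });
      root.innerHTML = renderGreeting(this.getAttribute('name') || 'world');
    }
  }
  window.customElements.define('my-greeting', MyGreeting);
}
```

Usage is then just markup: `<my-greeting name="reader"></my-greeting>`, with the red paragraph style scoped to the component's shadow tree.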


Shadow DOM

The shadow DOM allows for encapsulation of a component. As the following example demonstrates, all styles defined inline below will be applied to the given markup and will not affect the global scope.

var root = this.createShadowRoot();
root.innerHTML = '<style> * { color: red; } </style> <p>Hello</p>';

Browser support

Disclaimer: it is worth noting that today the web components technical draft is incomplete and has been known to change, so follow the draft with caution.

The features above are used to create a web component; however, each has its own level of browser support and adoption. Therefore web components are often partially implemented, or in most cases are just custom elements.

I won't spend too much time detailing browser support, as better resources can be found elsewhere and I would find it difficult to keep the content up to date.

There are various polyfills available to plaster over the holes cautious browsers have left, but they typically come at a performance cost. However, depending on what your use cases are, there are libraries that exist to make the creation of web components easier. These are the most popular:

  • Polymer provides a set of features for creating custom elements.
  • SkateJS is a JavaScript library for writing web components with a small footprint.
  • X-Tag is an open source JavaScript library that provides an interface for component development.

Polymer, for example, created a lighter-weight shim for the shadow DOM called shady DOM; more on that here.


And finally...

Web components certainly won't suit all projects, and I'm not saying you should adopt them blindly. If I were to create a greenfield SPA I would use a framework like React or Angular and take advantage of how they create reusable components.

However, the benefit of web components is that they utilise technologies that are native to the browser. This can be really compelling when working with microsites that each require consistent branding but have their own client-side frameworks. In this case we're using native behavior, and there's no reason to force additional dependencies upon them.

I'm currently working on a Progressive Web Application which will demonstrate real-world usages of web components, so once it's done I'll add the link to the post.

Thanks for reading. Please share any comments, experiences and links to related projects or posts.

]]>
<![CDATA[Serverless Architecture]]>
https://blog.zirafon.org/serverless-architecture/5b8c34071c30e311fc2499bfWed, 13 Jul 2016 22:22:12 GMTIf you are reading this post in July 2016 (when the post was written), you will probably agree serverless architecture is H.O.T right now.

Although the concept has been well used for quite some time, we now have an alias and a title associated. As a result we have a meetup we can go to, a conference where we can be preached to, and of course this all leads to the proverbial bandwagon to jump on. This is where we are right now, so let's ask the question: what really is serverless architecture, and how and when should we use it?

What

To begin with, the name serverless can be a little misleading. But in short, it's typically a small application or unit of work that, along with its dependencies, runs on servers that you don't manage or concern yourself with.

But what's new about that, right? Prior to the recent #devops movement, developers never really concerned themselves with consistency, availability and partition tolerance anyway.

But serverless takes it to the next level in the form of BAAS and FAAS, where there is absolutely no consideration of scaling and availability. Your cloud provider takes care of this with a typical 4x9 uptime SLA.

But for a better definition and insight into this, Martin Fowler and ThoughtWorks will give you greater detail than I will, and it really is comprehensive. This in itself suggests the relevance of serverless and its expected impact on future software solutions.

How

FAAS

Functions as a service is quite simply a method of executing code, or a unit of work, as a service.

Serverless is often better known as FAAS; as demonstrated by the tweet below, it's certainly a common opinion.


I for one first started seeing the term used more often last year, when AWS released API Gateway (https://aws.amazon.com/api-gateway/), a managed service that, alongside Lambda, offers "serverless" APIs.

The benefits are obvious: just as PAAS has brought cost-effective benefits, FAAS gives you a similar model. You pay for how much you use.

BAAS

Backend as a service is a similar approach to FAAS, but at a larger scale and covering more domains. A classic example is using authentication as a service. Auth0, for example, packages up all the complexities involved in being an OAuth provider: token persistence, user management and token serialisation, and offers it as and when you need it.

I've come across many custom implementations of OAuth, OpenID and identity providers, and all have their own faults due to the complex nature of the topic. It's a perfect candidate to offload to an expert.

But there are many more use cases. Consider the regulations you have to comply with when implementing payment providers, or simply when taking your end users' personal details.

This isn't just a solution for lazy software developers; this is a smart product decision to offload business value to an expert.

When

The problem I have with this architecture, whether we are using AWS Lambdas or Azure Functions, Blob Storage or an S3 bucket, is that we are building software driven by a chosen cloud provider.

My concern with building software cloud first and tying your product to a 3rd party is less about availability and more about losing control.

AWS, for example, sets the minimum supported Node runtime for Lambdas; what happens when they decide to increase it? All of a sudden your application needs a rewrite.

Although cloud provider costs are compelling now, what happens if this changes and you're tied to a given cloud platform? It quickly becomes unsustainable.

However, serverless and the adoption of the cloud offer excellent value, and current pricing tiers for serverless features and products allow us to build enterprise solutions on startup budgets.

... enterprise solutions on a startup budget

These are some considerations not to overlook when choosing to adopt serverless architecture. We're currently using it quite successfully, but as always there are tradeoffs to consider. I hope to blog a follow-up detailing the tradeoffs and disadvantages I have experienced.

Please share your links and examples below and thanks for reading.

]]>
<![CDATA[Intro to git hooks]]>
https://blog.zirafon.org/git-hooks/5b8c34071c30e311fc2499bdTue, 12 Apr 2016 16:45:27 GMTGit hooks are scripts that Git executes before or after events such as commit, push and branch. Git hooks are run locally and are a native feature, so there are no additional dependencies.

Using hooks is really trivial, and the scripts themselves are limited only by your imagination.

Below is a quick rundown on how to work with git hooks.

mkdir mygit-workflow
cd mygit-workflow
git init

git init creates an empty repository, and a handful of example hooks are copied into the .git\hooks directory. However, these are disabled by default. Let's take a look at them.

cd .git\hooks\
ls
Name
----
applypatch-msg.sample
commit-msg.sample
post-commit.sample
post-receive.sample
post-update.sample
pre-applypatch.sample
pre-commit.sample
pre-push.sample
pre-rebase.sample
prepare-commit-msg.sample
update.sample

Now, although these are just samples, they can be pretty useful out of the box. For example, take a look at pre-commit.sample.

cat pre-commit.sample

This basically does a diff on the changed files and prevents you from checking in filenames with non-ASCII characters. This can be helpful when a project runs across different platforms.

To enable one of the sample hooks, just remove .sample from the filename.

mv pre-commit.sample pre-commit

The sample files listed above give you a good idea of the events you can write a script to attach to.

Use cases

As I've mentioned, the usage of git hooks is limited only by imagination; however, I would advise you to proceed with caution and remember to choose the right tool for the job. Just because you can, it doesn't always mean you should.

I know of scenarios whereby teams use the post-receive hook to publish code to a production environment. Typically this isn't advised, as there are better products and solutions to manage your CI/CD strategy.

With this in mind, here are a couple.

  • commit-msg - Check the commit message and enforce project message rules
  • post-commit - Create a Slack notification to notify the team of pending changes to the remote.

I'm going to follow this post up with a working example soon; until then, you might find this one useful to get you started.

]]>
<![CDATA[Deleting those dam node_modules on windows]]>
https://blog.zirafon.org/deleting-windows-node_modules/5b8c34071c30e311fc2499beWed, 30 Mar 2016 09:00:13 GMTIf you use node on windows the chances are that you've come across the following error when trying to delete the node_modules directory or any of its parent directories.

The specified path, file name, or both are too long

or

The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

I previously used various hacks to get round this, including:

robocopy new_empty_directory node_modules /MIR

Which creates an empty mirror directory that you can then delete.

However, I recently came across the node module rimraf, created by npm founder Isaac Schlueter. Annoyingly, this project first started in Feb 2011, which means I've had a few years of unnecessary hurt, but that doesn't mean you have to! :)

Check out the project's wiki for better documentation, but generally all you will have to do is the following.

Install globally.

npm install rimraf -g

Then

rimraf .\node_modules\

will recursively remove all modules and their dependencies.

Thanks for reading.

]]>
<![CDATA[Hexagonal Architecture]]>
https://blog.zirafon.org/hexagonal-architecture/5b8c34071c30e311fc2499bcTue, 22 Mar 2016 12:31:58 GMT

Hexagonal Architecture was formally penned by Alistair Cockburn in the late 90s, but really the name is a little misleading; it can lead us to believe that our application should be defined by six known sides.

An example of what Hex. is not!
Hexagonal Architecture

But this can be confused with n-tier architecture, a pattern typically associated with a monolithic application, and it's important to understand that this is fundamentally a different approach.

The name Hexagonal is a metaphor for elements intersecting, but it is otherwise known as Ports & Adapters, which for me better explains the principles and helps us understand the fundamentals.

Known problems with n-tier architecture are the dependencies between layers, which lead to leaky abstractions. The classic example of this is when we reuse our domain models across presentation and data layers.

However, Hex. architecture suggests you create your application to work independently of a UI, database or CLI, for example. The quote below is taken from Alistair's website and summarises quite well how the architecture is defined.

Allow an application to equally be driven by users or programs and to be developed and tested in isolation from its eventual run-time device

a. cockburn

This approach forces each port to be independently testable and subsequently helps us build reusable software.

This is what makes Hex. so compelling for me. I am currently working for a client who uses a dozen different CM and CRM systems. Using Hex. architecture allows me to build the desired logic independently of any of these systems, write automated tests to validate the behavior, and then, come the time we want to integrate, there will be no surprises.

Hexagonal Architecture

Hex by Example:

I'm working on a lightning talk on Hex. architecture with an accompanying demo, so I'll follow up this post with those slides and a reference to the repo.

For now, the example below is designed to help you understand how many ports would be used to create a coffee machine.

A coffee machine controller has four natural ports: the user, the database containing the recipes and prices, the dispensers, and the coin box

Hexagonal Architecture

Ports and Adapters can further be defined as the following, using pseudo-code to demonstrate the relationship.

Ports are interfaces

interface ICoinBox {
   bool Insert(IAmAMonetaryValue value);
}

Adapters are implementations

class GbpCoinBox : ICoinBox {
   bool Insert(IAmAMonetaryValue value) {
      ...
   }
}

Putting it all together

It is all glued together using Bob Martin's Dependency Inversion principle: a modular approach for decoupling module implementation details, easily allowing us to interchange test harnesses and user interfaces as and when required.
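That wiring can be sketched in JavaScript as well (all the names here are invented for illustration; in a dynamic language the port is simply the shape an adapter must satisfy):

```javascript
// Sketch of ports-and-adapters wiring. CoffeeMachine depends only on
// the coin box port (anything exposing an insert method); the concrete
// adapter is injected from the outside, per dependency inversion.
class CoffeeMachine {
  constructor(coinBox) {
    this.coinBox = coinBox; // port shape: { insert(value) -> boolean }
  }
  buy(value) {
    return this.coinBox.insert(value) ? 'coffee' : 'refund';
  }
}

// A test-harness adapter: accepts any coin worth at least 50.
const fakeCoinBox = { insert: (value) => value >= 50 };

const machine = new CoffeeMachine(fakeCoinBox);
console.log(machine.buy(100)); // coffee
console.log(machine.buy(10));  // refund
```

Swapping fakeCoinBox for a production adapter is the only change needed to go from test harness to real device; the core logic never knows the difference.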

The net result is well-made coffee.

Hexagonal Architecture

That's it for now. Thanks for reading. Please leave comments or links to related posts.

  1. http://alistair.cockburn.us/Hexagonal+architecture
  2. Hexagonal Architecture example
]]>
<![CDATA[Reactive Programming]]>
https://blog.zirafon.org/reactive-programming/5b8c34071c30e311fc2499b9Tue, 19 Jan 2016 20:01:15 GMT

I recently did a presentation on reactive programming with examples in Rx.Net. The presentation gives a basic intro to the key concepts whilst attempting to help you understand what benefits Rx will bring to your life.

http://slides.com/chrisbrown-3/reactive-programming/#/

Any queries, please let me know. I've included some useful links in the presentation, so if you don't read the slides and you're interested in Rx then please follow them.

Thanks for reading.

]]>
<![CDATA[Visual Studio Code Snippets using ReSharper Live Templates]]>
https://blog.zirafon.org/resharper-live-template/5b8c34071c30e311fc2499baTue, 19 Jan 2016 08:58:49 GMT

A really useful yet often under-utilised feature of the ReSharper toolbelt is Live Templates. It's effectively yet another productive, time-saving technique that using ReSharper brings.

It's straightforward, and I use the example below almost every day. It's used to create a unit test stub to support GivenWhenThen.

Goto: ReSharper->Tools->Templates Explorer...

As highlighted below, if the test scope doesn't exist then select the New Template icon on the toolbar. If it does, then simply select and edit it.

This takes you to the template itself where I added the following.

[Test]
public void $Method$_$Scenario$_$Expected$(){
	// GIVEN:
	$END$
	// WHEN:
	// THEN:
}

And that's it! This then gives you the following code snippet.

Really simple and incredibly effective. The example above is primitive, but this shows better examples of how powerful it can be.

]]>
<![CDATA[Static Azure Website]]>
https://blog.zirafon.org/static-azure-website/5b8c34071c30e311fc2499b8Mon, 04 Jan 2016 01:28:35 GMTSo I recently blogged about creating a bespoke blog for next to nothing. It ended up effectively being a static site hosted on GitHub Pages, so I thought: why couldn't we do something similar with MSFT Azure Blob Storage?

I've used blob storage before, but for its typical use: storing website assets with a CDN in front. Still, I don't see why this shouldn't work.

First of all, it's important to understand that Azure Blob Storage is not a web server, so we immediately lose the benefits IIS brings, for example setting a default document. If I navigate to the root container I'll get a 404, so we will have to be explicit in referencing our filename, in this case /index.html.

Another related issue is how the browser will know how to render the blob. Seeing as it's not hosted on a web server, what would the response headers be? A quick confirmation of this was to simply upload a test page and navigate to it.

Azure Explorer

Navigating to it returns the following response headers.

HTTP/1.1 200 OK
Content-Length: 1492
Content-Type: text/html
Content-MD5: RWjh1LBpfG99DCHD+W1raQ==
Last-Modified: Mon, 04 Jan 2016 01:04:23 GMT
ETag: 0x8D314A2FC4F3940
Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
x-ms-request-id: 6e1b9f58-0001-0092-3091-46e306000000
x-ms-version: 2009-09-19
x-ms-lease-status: unlocked
x-ms-blob-type: BlockBlob
Date: Mon, 04 Jan 2016 01:47:42 GMT

Success!

So why would you want to consider this? Well, it's not suitable for all projects because of the limitations mentioned above, but if the project suits a static site then the infrastructure that Azure (https://azure.microsoft.com) and other notable cloud providers give you is certainly worth exploring.

With the example below, I uploaded a static SPA site generated using generator-react-static (https://www.npmjs.com/package/generator-react-static). This is to demonstrate what is possible.

https://zirafon.blob.core.windows.net/index.html

The site will render basic markup and the client will retrieve the data it needs from lightweight APIs.

static site architecture

This can obviously scale as much as you need it to.

site at scale

So this becomes a very affordable enterprise solution, minus the enterprise prices. Thanks for reading.

]]>
<![CDATA[My new $1 Blog]]>https://blog.zirafon.org/my-new-blog/5b8c34071c30e311fc2499b3Sat, 26 Dec 2015 17:44:24 GMT

As a self employed software developer it is essential to have some form of web presence.

I previously had a bespoke website using the latest tech, hosted on Azure, used primarily as a portfolio piece and attached to my email domain.

Alongside this I had a technical blog on Blogspot. I found keeping both up to date required more time than I had, so as a result I decided to amalgamate.

Apart from anything else, I didn't like the fact my blog was detached from my personal domain. As a developer, if you blog you need to own that content, and the obvious way to do so is to ensure it is housed under your personal domain. In my case this was blog.*.

As a developer if you blog, you need to own that content ...

So the next question was: which blog engine do I use? Because I didn't know how often I would blog, it needed to be 1. free and preferably 2. open source. It also needed to be 3. fast, and because I'm putting my name and reputation as a software developer behind it, it needed to be 4. extensible so I can customise it. The final requirement was that I wanted a 5. markdown blog engine.

Blog Engine

1. Free

As previously mentioned, I'm moving from Azure, which is a paid product. It's not expensive, but since the blog is effectively a static site, why do I need heavy compute power? I don't, and while I already knew of free blog offerings, most platforms charge for extras such as custom domains. So I set myself a challenge to see how cheap I could keep it, and preferably to keep it free without losing functionality.

...lets keep it free without losing functionality.

2. Open Source

I'm pretty passionate about open source software. I've profited from it previously and contributed where appropriate. Typically, open source projects have the best community support, and when starting out on a new project this is important! It obviously helps with being 4. extensible too.

3. Fast

This is an obvious one: a slow site makes for an awful user experience. There are various blog engines out there that focus on speed as a main feature, but it always comes at a trade-off.

4. Extensible

It's easy to extend and refactor open source work, but once you fork from the source it's difficult to keep up to date with new versions. I want to use the engine of the blogging platform without being tied to the default theme; I want to painlessly make a bespoke solution from a base product.

I want to painlessly make a bespoke solution from a base product

5. Markdown

Markdown is the de facto readme language and something I use daily. I'm hopeful it will make blogging faster and allow me to create rich content with a small amount of effort.

.. rich content with a small amount of effort.

So, to cut a long story short, I chose Ghost. This post is less about the why than the how, and you can find lots of writing about the same subject here.

Solution

So the title of this post is My new $1 Blog, and this is largely true. It's effectively the cost of a domain; everything else is free.

I'm really pleased with the result so far. It's pretty basic at the moment, but it's a work in progress. Please get in touch, I'll be keen to hear your thoughts on the theme. Any issues can be added here. Thanks for reading.

]]>
<![CDATA[Conflicts with Umbraco 7 install and ReSharper.]]>I'm currently working on a project that abstracts away the customer-facing UI, AKA the front end. We are effectively utilising Umbraco as a service rather than as a platform. As a CMS it is incredibly powerful and functional, but these benefits come at a cost for the customers interacting

]]>
https://blog.zirafon.org/conflicts_with_umbraco_7_install_and_re_sharper/5b8c34071c30e311fc2499b4Sat, 14 Mar 2015 21:20:00 GMTI'm currently working on a project that abstracts away the customer-facing UI, AKA the front end. We are effectively utilising Umbraco as a service rather than as a platform. As a CMS it is incredibly powerful and functional, but these benefits come at a cost for the customers interacting with the front end. Typically it's performance, but there is a whole heap of issues affecting large sites that I'm not going to go into in this particular post.

This post is about the Umbraco 7 installation. It's an obscure one, so I thought I would mention it here to aid folk in the future.

Basically, when you run `install-package umbracocms` you may come across the following error:

Install failed. Rolling back... 
install-package : Expected "$(_PublishProfileSet)" to evaluate to a boolean instead of "", in condition "$(_PublishProfileSet) And '$(PublishProfileName)' 
=='' And '$(WebPublishProfileFile)'==''". C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v12.0\Web\Microsoft.Web.Publishing.targets 
At line:1 char:1 
+ install-package umbracocms
+ CategoryInfo : NotSpecified: (:) [Install-Package], InvalidProjectFileException 
+ FullyQualifiedErrorId : NuGetCmdletUnhandledException,NuGet.PowerShell.Commands.InstallPackageCommand

It's a known conflict between ReSharper and the install. Further details can be found here, but in brief: disable ReSharper and try again.

Worked for me. I believe the issue is resolved in v9 of ReSharper, but further comments would be appreciated.

To disable/suspend ReSharper

Navigate to:  Tools -> Options -> ReSharper -> Suspend Now


]]>
<![CDATA[Intro into the OWIN specification and a Hello world example of Katana]]>Getting Started with OWIN and Katana

Firstly, let's start by trying to identify what the specification is. The following text is taken from owin.org.

OWIN defines a standard interface between .NET web servers and web applications. The goal of

]]>
https://blog.zirafon.org/intro_into_the_owin_specification_and_a_hello_world_example_of_katana/5b8c34071c30e311fc2499b6Tue, 03 Mar 2015 22:58:00 GMTGetting Started with OWIN and Katana

Firstly, let's start by trying to identify what the specification is. The following text is taken from owin.org.

OWIN defines a standard interface between .NET web servers and web applications. The goal of the OWIN interface is to decouple server and application, encourage the development of simple modules for .NET web development

Or in other words: Katana, the Microsoft implementation of OWIN, is basically an answer to Node.js, just as the ASP.NET MVC framework was the answer to Ruby on Rails.

This is evident when we compare the two hello world web server implementations:

1. Node Web server initiation

2. Katana self hosting web server

Now this is evidently a high-level comparison between the two, but it supports my point.
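The original post embedded screenshots of the two snippets; as a sketch, the Katana side of that comparison looks roughly like this, assuming the Microsoft.Owin.SelfHost NuGet package (the port is arbitrary):

```csharp
using System;
using Microsoft.Owin.Hosting; // from the Microsoft.Owin.SelfHost NuGet package
using Owin;

class Program
{
    static void Main()
    {
        // WebApp.Start spins up an HttpListener-backed host: no IIS, no System.Web.
        using (WebApp.Start("http://localhost:5000", app =>
            app.Run(ctx => ctx.Response.WriteAsync("Hello world"))))
        {
            Console.WriteLine("Listening on http://localhost:5000");
            Console.ReadLine(); // keep the console host alive until Enter is pressed
        }
    }
}
```

The Node equivalent is a similarly tiny `http.createServer(...).listen(...)` call, which is what makes the side-by-side comparison compelling.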

Katana is a collection of projects for supporting and implementing the OWIN specification. The value here is that it creates an abstraction between the web server and the application, meaning it completely decouples server dependencies from our application. This introduces the concept of: use what we need and nothing more.
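That "use what we need" idea is visible in the pipeline API: every piece of behaviour is a middleware you opt into explicitly. A minimal sketch (the header name here is made up for illustration):

```csharp
using Microsoft.Owin;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Inline middleware: inspects the request, then passes control on.
        app.Use(async (ctx, next) =>
        {
            ctx.Response.Headers.Append("X-Request-Path", ctx.Request.Path.Value);
            await next();
        });

        // Terminal middleware: handles the request itself.
        app.Run(ctx => ctx.Response.WriteAsync("Hello from the OWIN pipeline"));
    }
}
```

Nothing is in the pipeline unless you put it there, which is the opposite of the everything-included `System.Web` model.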

Significantly it includes OWIN support for System.Web and System.Net.HttpListener.

This is an obvious benefit when you review a typical .NET application and notice the dependent reference System.Web.dll, which is over 2.5MB and carries over 10 years of retrofitted development.

Where Node uses Node Package Manager (npm), Microsoft uses NuGet to manage these modules.

For this blog post, this is where the comparison between the two technologies stops. I don't want to dwell upon it; however, for further reading check out this performance comparison article. NB: at the time of publication the article was over a year old, so the results will no doubt vary.

MSFT's decision to open source their forthcoming ONE ASP.NET platform has meant the open source community has supported the Katana implementation well. Significant support that you can pull down today includes:

I've used a couple of these projects in a basic hello world application. It's a self-hosted project running from a console app, but a follow-up post will include a Linux host. It is available on GitHub for downloading, forking, or contributing.

View on GitHub

]]>