Pivotal CF (Cloud Foundry) Tutorial Part 2

Now that you have built your app, the next step is to add a database. Use the following command to see which services you can create:

> cf marketplace

You should see something like this:
service   plans       description
p-hd      Standard    Pivotal HD is the industry’s most full-featured Hadoop distribution.
p-mysql   100mb-dev   MySQL service for application development and testing
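
With the marketplace listed, the usual next step is to create a service instance and bind it to your app. Here is a sketch using the p-mysql service and 100mb-dev plan shown above; the instance name booking-db and app name booking-mvc are assumptions for illustration:

> cf create-service p-mysql 100mb-dev booking-db
> cf bind-service booking-mvc booking-db
> cf restage booking-mvc

Restaging the app makes the new service credentials available in its environment.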

Pivotal CF (Cloud Foundry) Tutorial Part 1

Here’s a tutorial on how to get started with Pivotal CF. In part 1 we will focus on deploying a pre-built app to the cloud.

Before you begin

Download the “CF” command line tool from: https://console.run.pivotal.io/tools

Download the sample Java app that we will be using for this exercise: booking-mvc.war.zip

Unzip the file to extract booking-mvc.war.

1. Set your target to the API address of the Cloud Foundry installation. In this case our domain is cf.mattzwol.com

> cf api api.cf.mattzwol.com

Note: you may need an extra flag on the command above to skip SSL validation; the CLI will tell you if you do. In that case use > cf api api.cf.mattzwol.com --skip-ssl-validation
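
Once the API target is set, the typical flow is to log in and then push the war file. A minimal sketch, assuming you accept the default org and space when prompted:

> cf login
> cf push booking-mvc -p booking-mvc.war

The -p flag points the push at the unzipped war file rather than the current directory.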


Build Your Own PaaS – At Home – Part 2!

Deploying Pivotal Cloud Foundry in Your Own Home Lab

I wanted to see what it would take to deploy Pivotal Cloud Foundry (Pivotal CF) and all the cool things that BOSH brings. I used a very minimal config to prove that a small footprint would work. Here is a summary of my setup:

Desktop 1: Intel Quad Core i7, 32 GB RAM, 2TB Internal Drive
Desktop 2: Intel Quad Core i7, 32 GB RAM, 2TB Internal Drive
Synology DS1513+ NAS Device with 3 x 240GB Intel SSDs in a RAID-5 config
24 Port HP Gigabit Ethernet Switch

Pivotal and Cloudera Discuss Hadoop and Isilon

It’s not every day that you see the CEO of Pivotal and the CEO of Cloudera on stage together. Today Paul and Tom discussed where they see the future of Hadoop and how Isilon is helping them move Hadoop into the centre of the enterprise.

Paul Maritz’s comments centred on the idea that Hadoop is becoming a critical platform for enterprises, hence the need for Isilon’s ability to make Hadoop enterprise-ready.

Revenue for EMC Isilon in the Hadoop space is growing at 259% per annum.

Customers at the same session (FedEx and Adobe) talked about how Isilon drives down the Hadoop TCO by eliminating the “triple replicate penalty”, while improving availability through business continuity and eliminating data loading by being the central data hub.

Paul said it would take a while, but he can see a point where 50% of all workloads in the world are on HDFS.

The CEO of Cloudera, Tom Reilly, said that in Intel’s view the #1 consumer of Intel CPU cycles today is ERP, but eventually it will be Hadoop.

A central theme was the religious debate over whether Hadoop should be used purely as a staging platform, with data moved into warehouses for manipulation, or whether Hadoop should be the manipulation engine itself. Clearly the view from Pivotal and Cloudera is the latter.