Part 9: Apple Watch

This is part of a series of posts related to My Internet of Things and MobileFirst adventure.  An index to all posts can be found at the end of the first post.

Ok, I know I’m getting really carried away on this thing now, but I got an Apple Watch and thought it might be kinda cool to see if I could create a Watch app for my doorbell app.

I’m going to focus on the unique challenges I ran into for this app and not get too deep into how you build a Watch app. There are resources out there (although not as many as I had hoped) to teach you how. One particularly good resource I purchased is a tutorials book from Ray Wenderlich called “watchOS 2 by Tutorials“. This is definitely worth the price of admission if you are serious about learning how to build for the Watch. One caution I will leave you with – make sure whatever resources you are using are for watchOS 2. There were significant changes between version 1 and version 2, and there is still quite a bit of watchOS 1 info out there.

There are three features of the Doorbell app I would consider making available on the Watch.  Well, actually, there are really only three features in the whole Doorbell app:

  1. Push notifications when someone rings the doorbell.
  2. Displaying a picture from the Raspberry Pi.
  3. Displaying video from the Raspberry Pi.

It didn’t take me long to see that #3 is a long shot.  There are very few video viewing apps for the Watch and those that do exist provide the video content as files transferred to the Watch.  I considered rearchitecting things to capture short video clips to files that could be sent to the Watch, but I pretty quickly wrote that off.  I’ll focus on Push notifications and pictures.

A Watch app is not a separate app at all.  It is really an extension of the iPhone app.  A Watch app cannot exist without its “host” iPhone app.  You get started by adding a Watch target to your existing iPhone app.

Display a picture

My plan was to put a button on a Watch interface that would request a picture from the iPhone app and would then display it in a WKInterfaceGroup or something.  Creating the UI was pretty easy – there are a lot fewer options than on iOS.

The WatchConnectivity framework gives you various ways to communicate between the iPhone and the Apple Watch. Interfaces can be immediate (sendMessage) or background (transferFile, updateApplicationContext). My first instinct was that I wanted the picture immediately, so I should use sendMessage. This actually did work, but sendMessage has a size limit on the payload. I never did find an authoritative source that defined the specific value, but unless I really reduced the resolution on my picture, I exceeded it.
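
For reference, here is a minimal sketch of the immediate path from the Watch side, in current Swift spelling (the watchOS 2-era code used WCSession.defaultSession()). The message keys and the idea of returning the image bytes directly in the reply are my assumptions, and only a small image will fit under the payload limit.

    import WatchConnectivity

    // Ask the phone for a picture over the immediate channel.
    func requestPictureImmediately() {
        guard WCSession.default.isReachable else { return }
        WCSession.default.sendMessage(["command": "takePicture"],
            replyHandler: { reply in
                // Hypothetical reply shape: small image bytes inline.
                if let data = reply["imageData"] as? Data {
                    print("Received \(data.count) bytes")
                }
            },
            errorHandler: { error in
                print("sendMessage failed: \(error)")
            })
    }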

So I took the approach of having the Watch request the picture using updateApplicationContext. The iOS app would then send the takePicture command to the Raspberry Pi, receive the picture in packets, reassemble it, and send it to the Watch using transferFile. This actually worked brilliantly – as long as the iOS app was in the foreground.
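
A sketch of both halves of that background path; the dictionary keys and file name are assumptions:

    // Watch side: request a fresh picture. The timestamp makes each
    // context dictionary unique so repeated requests are still delivered.
    try? WCSession.default.updateApplicationContext(
        ["request": "picture", "requestedAt": Date().timeIntervalSince1970])

    // iPhone side: after reassembling the picture from the Pi's packets,
    // write it to a temporary file and queue a background transfer.
    func sendToWatch(_ imageData: Data) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("doorbell.jpg")
        try? imageData.write(to: url)
        WCSession.default.transferFile(url, metadata: ["takenAt": Date()])
    }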

This is where I ran into another constraint from Apple. The iOS app is VERY limited in what it can do when in the background. Namely, it can only use the network for very limited reasons, and publishing MQTT commands doesn’t seem to be one of them. There are other ways to do this. You can have an iOS app do some HTTP operations in the background if you can justify classifying it as a VoIP or News app. I’m sure that’s how Watch apps such as weather apps get regular updates from the host app.

Nonetheless, I did get something to work as long as the iOS app is active.  I can get a picture from my Raspberry Pi to the Watch.

[Watch screenshots: Refresh, Requesting, Returned]

Push notifications on the Apple Watch

I learned that there is an involved set of rules that determines when a Push notification shows up on the Watch and when it shows up on the iPhone, even if you have built in the ability to handle the Push on the Watch.  You don’t get to choose – iOS makes the decision.  Basically, the notification will always go to the iPhone if it is not locked and asleep.  Even if it is, the Watch must be on your wrist before iOS will decide to let the Watch present the notification.  Makes sense I guess.  Why would you want a notification to show up on your Watch if you have the iPhone in your hand and you are looking at it?

The Watch app has an ExtensionDelegate class which is analogous to the AppDelegate of the iPhone app. You override the method handleActionWithIdentifier to handle the custom action buttons (Picture and Video) that I registered in the iPhone app’s AppDelegate. Now when the doorbell is pressed, Node-RED on the Raspberry Pi sends a request to the IBM Push Notifications service, which in turn has APNS send a push. If the iPhone is asleep in my pocket, I get a Push notice on the Watch with the Picture and Video buttons.
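
Here is a minimal sketch of that delegate method in the Swift 2-era syntax the watchOS 2 SDK used; the action identifiers are hypothetical and must match the ones registered with the notification category on the iPhone:

    import WatchKit

    class ExtensionDelegate: NSObject, WKExtensionDelegate {
        // Called when the user taps a custom action button on the Watch.
        func handleActionWithIdentifier(identifier: String?,
                forRemoteNotification remoteNotification: [NSObject : AnyObject]) {
            switch identifier {
            case "PICTURE_ACTION"?:
                // Kick off the picture flow (or the Handoff described below).
                break
            case "VIDEO_ACTION"?:
                break
            default:
                break
            }
        }
    }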


The problem is, what do I do now that I have received the Push on my Watch?  Since I already found that I can’t ask the iPhone app to go get me a picture and then update it on the Watch, I am kinda stuck.  I did end up implementing Handoff so once I tap either Video or Picture on the Watch, I can slide up on the lock screen icon on my iPhone and have it take me directly to the corresponding view and refresh it.
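
The Handoff piece is one call from the Watch. A sketch with a hypothetical activity type, which must also appear in the iPhone app’s NSUserActivityTypes:

    import WatchKit

    class DoorbellInterfaceController: WKInterfaceController {
        // Invoked (hypothetically) when the user taps Picture. Publishing a
        // user activity puts the app icon on the iPhone lock screen, and the
        // userInfo tells the iPhone app which view to open and refresh.
        func handoffToPictureView() {
            updateUserActivity("com.example.doorbell.viewPicture",
                               userInfo: ["view": "picture"], webpageURL: nil)
        }
    }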


So as it turns out, the Apple Watch probably isn’t the ideal platform for a visual doorbell app.  But it was an interesting experiment.  At this point, I think I have milked this adventure for all it is worth.

Conclusion

Well, it has been quite an adventure.  I can’t say that I have a terribly useful app at the end of all this, but it did give me a platform to learn a lot of stuff.

If you are interested, the full code for the project can be found on GitHub.

 

Part 8: Watching video from the Pi on the Phone

This is part of a series of posts related to My Internet of Things and MobileFirst adventure.  An index to all posts can be found at the end of the first post.

Let me say right off that there are dozens of alternatives available if you want to stream video from a Raspberry Pi to another device.  They all seem to have their pros and cons.  Some are really inefficient.  Others require a lot of processing and an additional app server.  Still others are very proprietary and only work on a very limited set of devices or platforms.  So, feel free to try your own approach.

I decided to go with the VLC media player. I have known VLC as a good media player for a long time, but I had no idea it could do so many more things – in particular, act as a video streamer.

So here’s the basic plan:

  1. Configure the Raspicam to record video to a fifo file.
  2. Configure VLC on the Raspberry Pi to stream that fifo content to a URL.
  3. Connect a VLC player object embedded in the iPhone app to the URL stream.

Install VLC on the Raspberry Pi

VLC is available as a standard Debian package, so you can install it on the Raspberry Pi with:

sudo apt-get install vlc

That was easy.

Add the VLC SDK to the iOS app project

There’s a Pod for that.  In fact there are multiple.  I went with MobileVLCKit.
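
A minimal Podfile sketch (the target name is hypothetical); run pod install and open the generated .xcworkspace:

    target 'Doorbell' do
      pod 'MobileVLCKit'
    end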

Create the Raspberry Pi Node-RED flows

Let’s start on the Raspberry Pi side and deal with sending the command from the app later.  Let me warn you, this gets a little crazy.  There are several steps here that I probably could have done more simply in a shell script.  But I tried to stay in Node-RED as much as I could, which ended up complicating some things.  Here we go.

Start Stream Flow

Create the following flow:

[Start Stream Node-RED flow diagram]

I warned you.  Let’s go through the nodes:

  • startStream  Receives the IoT Foundation startStream command from the application.
  • mkfifo  Creates a fifo file on the filesystem. This is an ‘exec’ node, which just runs the given OS command. It returns stdout as the payload of the message, but in this case I don’t really care about the contents of stdout.
  • start raspivid  Invokes the shell command “/home/pi/bin/startraspivid.sh”. This shell script starts up raspivid recording 640×640 video at 25 frames per second, which seemed to produce decent looking video while keeping the bandwidth requirements down. These numbers could definitely be tweaked if needed. raspivid streams to the fifo file named vidstream. Yes, the rest of the command looks odd. The issue was that I needed the node to launch raspivid, let it continue running in another process, yet return execution to the next node. I tried a number of approaches and the only one that seemed to work was to background it (‘&’), but I found I also had to redirect stdout and stderr (‘ > raspividout.txt 2>&1’) or execution wouldn’t return to the node. (See the script sketches after this list.)

  • start vlc  This is similar to the raspivid node.  It invokes a bash shell script that starts up VLC (‘cvlc’).  VLC will take the vidstream fifo file as input and produce a video stream on port 8554.  I had to do the same dance with backgrounding and redirecting output as I did with the raspivid script.
    VLC has a seemingly infinite number of command line switches and arguments.  I won’t claim to have figured all this out myself.  I relied heavily on the VLC documentation and a couple blogs on the Raspberry Pi forum.

  • get IP Address  This function node figures out the IP address of the Raspberry Pi. I need this because the iPhone app will need to know where to connect the VLC player to view the stream. This brings up a significant limitation of the VLC approach – the Raspberry Pi must be visible to the mobile device on the network. Unless you are willing to go through all the steps necessary to get your Raspberry Pi a public IP address and deal with all the security issues that will arise, that’s a pretty big limitation. I decided I can live with the limitation that video only works while my phone is on my local Wifi network with my Raspberry Pi. (See the sketch after this list.)

  • streamStarted  An ibmiot output node that publishes the device event streamStarted with a payload that includes the IP address of the Raspberry Pi.
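
Since the two shell scripts never appeared inline, here is a rough sketch of what they might look like. The raspivid and cvlc arguments are assumptions pieced together from the description above and typical Raspberry Pi forum recipes; the VLC --sout chain in particular may differ from what I actually used.

    #!/bin/bash
    # startraspivid.sh (sketch): record 640x640 H.264 at 25 fps to the
    # vidstream fifo forever, backgrounded, with output redirected so the
    # exec node gets control back.
    raspivid -o /home/pi/vidstream -t 0 -w 640 -h 640 -fps 25 > raspividout.txt 2>&1 &

    #!/bin/bash
    # startvlc.sh (sketch): read the fifo as raw H.264 and serve a stream
    # on port 8554. The exact --sout chain is an assumption.
    cvlc /home/pi/vidstream :demux=h264 --sout '#rtsp{sdp=rtsp://:8554/stream}' > vlcout.txt 2>&1 &

And a sketch of the get IP Address function node, assuming Node’s os module was exposed to function nodes through functionGlobalContext in the Node-RED settings.js:

    // get IP Address (sketch): find the Pi's first external IPv4 address.
    var os = context.global.os;  // functionGlobalContext: { os: require('os') }
    var ifaces = os.networkInterfaces();
    var ip = null;
    for (var name in ifaces) {
        ifaces[name].forEach(function (iface) {
            if (iface.family === 'IPv4' && !iface.internal && !ip) {
                ip = iface.address;
            }
        });
    }
    msg.payload = { ipAddress: ip };
    return msg;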

Stop Stream Flow

The Stop Stream flow is invoked by receiving a stopStream IoT Foundation command from the mobile app. It simply kills the vlc and raspivid processes on the Pi and then publishes a device event acknowledging that it did.

[Stop Stream Node-RED flow diagram]

  • stopStream  An ibmiot input node that receives the command stopStream.
  • Stop Stream  An exec node that runs the script stopstream.sh (see the sketch after this list).

  • format Message  My mobile app doesn’t actually do anything with this, but I created this node to parse out the process identifiers (PIDs) of the vlc and raspivid processes that were killed and include them in the payload of the streamStopped event.

  • streamStopped  An ibmiot output node that publishes the device event streamStopped with a payload that includes the PIDs of the killed processes.
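
A sketch of what stopstream.sh might look like; echoing the PIDs before the kill is an assumption that lines up with the format Message node wanting them in the event payload:

    #!/bin/bash
    # stopstream.sh (sketch): report and kill the streaming processes.
    PIDS=$(pgrep -f 'vlc|raspivid')
    echo $PIDS
    kill $PIDS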

Add functionality into iOS app

Again, I won’t go into all the details here, but I want to hit the high points. What needs to be done in the app is:

  1. Add a UIView object to a view to act as the ‘drawable’ for the mediaPlayer.
  2. Add a VLCMediaPlayer object.
  3. Send the startStream command through IoT Foundation.
  4. Receive the IP address and have the mediaPlayer attempt to connect to it.

Add a UIView object

I decided to create a tabbed view controller, making the first tab the Picture view controller and the second the Video view controller. The Video view controller just has a UIView on the storyboard with an IBOutlet in the view controller code.

Add a VLCMediaPlayer object

With the MobileVLCKit pod added to my Xcode workspace, this is as simple as declaring a VLCMediaPlayer property on the view controller.

In the view controller’s viewDidLoad method, I configure the mediaPlayer.
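
A minimal sketch of both steps, assuming the storyboard outlet is named videoView:

    import UIKit
    import MobileVLCKit

    class VideoViewController: UIViewController {
        @IBOutlet weak var videoView: UIView!
        let mediaPlayer = VLCMediaPlayer()

        override func viewDidLoad() {
            super.viewDidLoad()
            // Tell VLC to render its video output into our UIView.
            mediaPlayer.drawable = videoView
        }
    }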

Send the startStream command

In the view controller’s viewWillAppear method, I send the startStream command. This is really the same as sending the takePicture command. I do provide a completion handler, but this time the handler will only be invoked once – when the streamStarted event is published by the Raspberry Pi.

Receive the IP address and connect the MediaPlayer

The callback gets invoked when the streamStarted event is published.  The payload of the streamStarted event is the IP address of the Raspberry Pi, so all that is left is to set the mediaPlayer’s media value and tell it to play.
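
A sketch of that callback; the payload key and the URL shape are assumptions that follow from the VLC setup above:

    // Called when the streamStarted event arrives from the Pi.
    func streamStarted(payload: [String: Any]) {
        guard let ip = payload["ipAddress"] as? String,
              let url = URL(string: "rtsp://\(ip):8554/stream") else { return }
        mediaPlayer.media = VLCMedia(url: url)
        mediaPlayer.play()
    }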

Remember, though, that as we discussed earlier, the mediaPlayer will not be able to connect to the stream if the Pi is not visible to the mobile device. If that is the case, the mediaPlayer object throws up an alert. I could have handled that in my code with a delegate, but I decided to just let the mediaPlayer deliver the bad news.


Summary

There are a lot of ways to skin this cat.  I chose to use VLC because it was relatively easy (compared to other methods I considered), but it does have the significant limitation that my Raspberry Pi has to be network-ly visible to my iPhone.  But nonetheless, while I am in the same Wifi region, it is pretty cool that when I hear the doorbell ring, I don’t even need to get up out of my recliner to see if it is a siding salesman.  The world of WALL-E is fast approaching.

So next I went a little crazy.  I got a new Apple Watch and thought I would see if I could build it into this environment in the next post.

Part 7: Requesting and Receiving a Picture

This is part of a series of posts related to My Internet of Things and MobileFirst adventure.  An index to all posts can be found at the end of the first post.

The next step is to see if we can make the iOS app request a picture from the Raspberry Pi and have it respond with one.

I started by creating a simple iOS app that had a button to start the request process and a UIImageView to display the picture. The UI eventually evolved a bit, but this gave me a simple place to start.

MQTT Mobile Client software

There is no iOS SDK specific to the IBM IoT Foundation.  There are, however, several MQTT pods out on cocoapods.org that could do the job.  MQTT is the protocol underlying the messaging in the IoT Foundation.  I chose to use MQTTClient.  I followed the simple instructions to install using cocoapods.  MQTTClient provides interfaces at various levels.  The simplest way to use MQTTClient is through MQTTSessionManager.

IBM IoT Foundation conventions

The IoT Foundation does layer on some extensions that are supported by conventions you must observe when using MQTT. The full IoT Foundation documentation is very helpful.

First of all, the IoT Foundation considers there to be two types of “things” in the Internet of Things: devices and applications.

Devices:

  • A device can be anything that has a connection to the internet and has data it wants to get into the cloud.
  • A device is not able to directly interact with other devices.
  • Devices are able to accept commands from applications.
  • Devices uniquely identify themselves to the IoT Foundation with an authentication token that will only be accepted for that device.
  • Devices must be registered before they can connect to the IoT Foundation.

Applications:

  • An application is anything that has a connection to the internet and wants to interact with data from devices and/or control the behaviour of those devices in some manner.
  • Applications identify themselves to the IoT Foundation with an API key and a unique application ID.
  • Applications do not need to be registered before they can connect to the IoT Foundation, however they must present a valid API key that has previously been registered.

For some reason, it took me a while to wrap my head around this. I wanted to consider my iPhone a “device”, but in fact it would be an “application”. For one thing, it is going to be sending commands to get information from the Pi. For another, note that you don’t register applications in advance – they just provide a unique key. This would be important if I were to go into production with my doorbell; I can’t register every possible mobile phone in advance.

Commands:

  • Commands are the mechanism by which applications can communicate with devices. Only applications can send commands, which must be issued to specific devices.

Events:

  • Events are the mechanism by which devices publish data to the Internet of Things Foundation.

So my iOS app will be an application that will send commands and subscribe to events that will be published by my Raspberry Pi which is a device.  Got it.

Here are some other MQTT concepts and their manifestation in IoT Foundation.

Connection parameters

  • MQTT host:  The host of the IoT broker is org_id.messaging.internetofthings.ibmcloud.com where org_id is the org assigned to your IoT Foundation service when it was created. You can find it from the Bluemix dashboard for your application.
  • MQTT client identifier:  This is an identifier the application must provide when it connects to the IoTF. It must be unique. The convention is a:org_id:app_id, where org_id is the same as above and app_id is something unique for each app instance. Since there will only be one app instance at a time for my simple app, I just used the Bluemix AppID for this.
  • MQTT username:  Use the IoT Foundation API Key generated when you created your IoT Foundation service.
  • MQTT password:  Use the IoT Foundation Authentication Token.

Topic names

The IoT Foundation uses a strict convention for MQTT topics that map to device types, device IDs, commands, and events.  Reference the documentation for all the details, but here are the topic strings I used for the command I would send from the application and the event I would subscribe to:
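
They follow the documented pattern; with my device type and id replaced by placeholders, the command and event topics look like this:

    iot-2/type/<device_type>/id/<device_id>/cmd/takePicture/fmt/json
    iot-2/type/<device_type>/id/<device_id>/evt/pictureTaken/fmt/json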

Connecting to IoT Foundation from the Application

Again, sparing some of the details, here’s my connect() method:
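
A rough Swift sketch of it using MQTTSessionManager, where manager, apiKey and authToken are assumed properties. The connect parameter list has changed across MQTTClient releases, so treat the exact signature as an assumption; the credential mapping follows the conventions above.

    import MQTTClient

    func connect() {
        let host = "org_id.messaging.internetofthings.ibmcloud.com"
        let clientId = "a:org_id:app_id"
        let eventTopic = "iot-2/type/<device_type>/id/<device_id>/evt/pictureTaken/fmt/json"

        manager.delegate = self
        manager.subscriptions = [eventTopic: NSNumber(value: 0)]  // QoS 0
        manager.connect(to: host, port: 1883, tls: false, keepalive: 60,
                        clean: true, auth: true,
                        user: apiKey,     // IoT Foundation API key
                        pass: authToken,  // IoT Foundation auth token
                        will: false, willTopic: nil, willMsg: nil,
                        willQos: .atMostOnce, willRetainFlag: false,
                        withClientId: clientId)
    }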

Receiving the picture

The first step is to send the takePicture command to the Raspberry Pi through the IoT Foundation. The actual content of the message is irrelevant in this case; the operation on the Pi is invoked just by receiving the command.
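
Sending it is a one-liner; since the Pi ignores the content, an empty JSON payload does the job:

    // Fire the takePicture command at the Pi.
    let commandTopic = "iot-2/type/<device_type>/id/<device_id>/cmd/takePicture/fmt/json"
    manager.send("{}".data(using: .utf8), topic: commandTopic,
                 qos: .atMostOnce, retain: false)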

The picture will come back from the Raspberry Pi as a Base64 encoded string in the data of the pictureTaken event. This wasn’t as easy as I expected. It turns out that MQTT brokers limit the size of the messages they will transmit. Who knew? Well, I didn’t anyway. So, as you will see shortly, I break the picture up into 3kb chunks and send them back in sequence. On the application side, I created a completion handler that pieces the picture back together again. The completion handler gets called each time a packet event arrives.

The MQTTSessionManagerDelegate routine handleMessage gets invoked each time a packet arrives, which then invokes the callback.

The callback is smart enough to know when it receives the final packet and then completes the reconstruction of the image and eventually displays the picture in the UIImageView.
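
A sketch of the delegate method plus the reassembly logic, inside the view controller (imageView is the UIImageView outlet). The packet field names match the ones added in the Node-RED flow below, but the exact JSON shape is an assumption:

    var chunks = [Int: String]()  // packet index -> Base64 fragment

    // MQTTSessionManagerDelegate: called once per arriving packet event.
    func handleMessage(_ data: Data!, onTopic topic: String!, retained: Bool) {
        guard let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
              let index = json["packetIndex"] as? Int,
              let total = json["totalPackets"] as? Int,
              let fragment = json["data"] as? String else { return }
        chunks[index] = fragment
        if chunks.count == total {
            // Final packet: reassemble, decode, and display.
            let base64 = (0..<total).compactMap { chunks[$0] }.joined()
            if let imageData = Data(base64Encoded: base64) {
                imageView.image = UIImage(data: imageData)
            }
            chunks.removeAll()
        }
    }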

Servicing the command on the Raspberry Pi

There are Node-RED nodes available to communicate directly to the IoT Foundation from within Node-RED.  Follow the instructions for node-red-contrib-scx-ibmiotapp to set it up in the Raspberry Pi.

With the ibmiot nodes in place, create the following flow:

Node-RED flow that takes a picture

Some details:

  • takePicture  This is the IoT node that receives the takePicture command from the application.  Configure it with the requested credentials.  The device type and device id must match those you created for the Raspberry Pi in the IoT Foundation dashboard.  The command is “takePicture”.
  • Take Picture  This function node is pretty much the same as in an earlier blog post.  It invokes the camera through JavaScript in the Node.js environment.  It sends on a message containing the picture’s filename and timestamp.

  • delay 2 s  The camera is asynchronous so you need to give it a little time to take the picture and save it to a file before processing.
  • Base64 encode  This is an exec node, which runs an operating system command. The command value is “base64 -w0” and the option to append msg.payload is checked, so the OS command base64 runs against the filename provided by the Take Picture node. “-w0” keeps base64 from injecting newline characters into the string. The Base64 string is sent to the next node as the payload.
  • splitIntoPackets  This function node slices the payload into an array of messages containing 3k chunks of the data. The date, picture name, number of packets and packet index are all added to each message as well. The array becomes a series of messages sent by the node. The node also has an output that sends the picture name and one that sends the total number of packets. These are just for debugging purposes. (See the sketch after this list.)

  • pictureTaken  This ibmiot output node sends the device event “pictureTaken” for each packet.  Use the same values for authentication, device type and device id that you used for the ibmiot in node.
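
A sketch of the splitIntoPackets function node; the packet property names (data, pictureName, totalPackets, packetIndex) are my assumptions, as are the incoming msg.filename and msg.date fields:

    // splitIntoPackets (sketch): slice the Base64 payload into 3k chunks.
    var b64 = msg.payload;
    var chunkSize = 3 * 1024;
    var total = Math.ceil(b64.length / chunkSize);
    var packets = [];
    for (var i = 0; i < total; i++) {
        packets.push({ payload: {
            data: b64.substr(i * chunkSize, chunkSize),
            pictureName: msg.filename,
            date: msg.date,
            totalPackets: total,
            packetIndex: i
        }});
    }
    // Output 1: the packet series; outputs 2 and 3: debug info.
    return [ packets, { payload: msg.filename }, { payload: total } ];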

Where are we?

Ok, we covered a lot of ground in this post.  We talked a bit about MQTT and how IoT Foundation fits with it.  We looked at how the Swift code will use the MQTTClient library to send commands and receive picture packets and reassemble them.  We also looked at how the Raspberry Pi will receive the IoT command through the IoT input node, use the Raspicam Node module to take a picture, use an OS command to convert it to Base64, use a function node to packetize it, then use the IoT output node to send the packets back to the phone as events.

I didn’t show it here, but I added some code to have the iOS app invoke this whole process in response to the user’s choice from the Push notification.  So I accomplished what I had set out to do.  The Pi sends a push notification to the phone to let the user know there is someone ringing the doorbell.  The user can then choose to see a picture of the visitor by tapping Picture in the push notification.  The app will then send an MQTT command to the Pi through the IBM IoT Foundation.  The Pi then takes a picture and sends it back to the phone which displays it.

Next

I decided to go for some bonus points.  What if the user could actually watch video of the visitor?  Next post.

An example of using Git, IBM Bluemix DevOps Services and MobileFirst Studio

I introduced some basic concepts around using Git with IBM MobileFirst Platform in the last post.  This post will build on that through an example using IBM Bluemix DevOps Services as the host for my repository.

Getting started

First, go to https://hub.jazz.net/. If you don’t have a free account, create one using the SIGN UP button.


Once logged in, you will see a screen showing you all the projects you have on DevOps Services. Create a new project.


If you have code in a GitHub project already, you can just link to it. Let’s assume you do not, so click Create a new repository and choose Create a Git repo on Bluemix. Accept the defaults to create a private project with Scrum features and don’t make it a Bluemix project (that’s a discussion for another day).


You now have a project on DevOps Services to contain your code. The Git URL needed to reference this repository can be copied from the Git URL link in the upper right corner of the Git view. Do that now because you will need it shortly.


The next steps are done from your local machine.

Adding a project to Git

From the client side, it is recommended NOT to store your projects within the Eclipse workspace. Unfortunately, the Studio New Project wizard doesn’t give you the option to create the project anywhere else. Fortunately, Eclipse projects are portable and can be moved around easily. The easiest way I have found to get started is to:

  1. Create a local Git repo using MFP Studio
  2. Create an MFP project
  3. Move the project into the Git repository
  4. Create your .gitignore file
  5. Put everything under SCM control
  6. Push your changes to the DevOps Services server

Let’s quickly walk through that.

Create a local Git repo

  1. Open MFP Studio to a new workspace.
  2. Open the Git Perspective
  3. In the Git Repositories view, choose Clone a Git repository.
    When asked to select a destination, pick a spot on your file system where you will store Git repositories, say /git, and then add a folder within it with the name of your project (e.g. /Users/dennis/git/mySampleProject). Now you have an empty Git repository that references the one on DevOps Services as its “remote”.

Create an MFP project

  1. Right-click on the Working Directory node in the Git Repositories view and select Import Projects. Select Use the New Projects wizard.
  2. Walk through the wizard to create a project named mySampleProject containing a hybrid app named mySampleApp, just as you normally would.
  3. Create an Android environment in the mySampleApp project.

Move the project into the Git working directory

Unfortunately, MFP Studio simply creates the project within the root of the workspace, which is not what you want. Fortunately, it is easy to move.

  1. Switch to another perspective such as J2EE or Design.
  2. Right-click the project and select Refactor > Move to move it within your Git repository. Note that you cannot put it directly in the Git repository root. Create a subdirectory such as /Users/dennis/git/mySampleProject/mySampleProject.

Create your .gitignore file

  1. Switch back to the Git perspective and hit refresh in the Git Staging view. You should see about 217 files listed as Unstaged Changes, which means they are available to add to SCM. But remember, we don’t necessarily want all of them in there.
  2. Go to https://github.com/andrewferrier/mfp-gitignore and download the MFP_7.0.gitignore file.
  3. Rename the file to just .gitignore and place it at the root of your project.
  4. Go back to the Git Staging view, refresh, then drag the .gitignore file from Unstaged Changes to Staged Changes.
  5. Provide a commit message like “Setup .gitignore file”.
  6. You should see the Unstaged Changes file count drop to something like 109 because the .gitignore file is filtering out the stuff you don’t want committed.

Put everything under SCM control

  1. Select all files in Unstaged Changes and drag them to Staged Changes.
  2. Provide a commit message like “Project Load” and commit.

At this point, everything is under Git control, but only on your local repository.

Push your changes to DevOps Services

  1. In the Git Repositories view, right-click mySampleProject/Branches/Local/master and select Push Branch.
  2. Accept all defaults. When the operation finishes, go back to your DevOps Services project and refresh the browser. You will now see your project on the master branch on the server.
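
If you prefer the command line to the Eclipse tooling, the same end-to-end flow looks roughly like this (the repo URL is the one you copied earlier):

    git clone <your-devops-services-git-url> mySampleProject
    cd mySampleProject
    # ...create the MFP project in a subdirectory and add the .gitignore...
    git add .
    git commit -m "Project Load"
    git push origin master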

Conclusion

Managing source code is critical in any project, including MobileFirst projects. It takes a few steps to get set up with Git, but it will pay you dividends quickly.

Source Code Management with IBM MobileFirst Platform

This is a high level overview of managing the source code of your mobile application project in the IBM MobileFirst Platform (MFP). You should come away with enough foundational knowledge to start implementing Source Code Management (SCM) in your own project.

Why manage source code?

There are many varied reasons for using an SCM system. Some reasons are more relevant to developing code as part of a team. SCM helps:

  • Share code with others
  • Coordinate and share changes with others
  • Understand what work has been done, when, where and by whom

But, SCM provides a lot of value, even if you are a development team of one. SCM helps:

  • Organize your changes into a series of steps
  • Identify the codebase for all releases
  • Compare versions of code
  • Review history of changes
  • Compartmentalize or package specific feature or bug fixes for distribution

There are many, many reasons why SCM makes sense, and those reasons apply to mobile development projects as much as any other type of project.

SCM and the IBM MobileFirst Platform

There are basically two ways you can develop software for the IBM MobileFirst Platform: using MobileFirst Studio (hereafter referenced as simply “Studio”) or the MobileFirst Command Line Interface (CLI). The concepts I will be discussing apply to both. Most SCM tools provide a CLI that can be used to issue necessary commands to commit your changes, create branches, etc. if you are not using Studio.

Studio is based on the Eclipse IDE. This makes it possible to use any Eclipse SCM tool integration directly from the IDE. These Eclipse integrations provide some really useful graphical tools that help you visualize the changes to your code and merge conflicts when necessary. If the IDE integration doesn’t do everything you need, you can revert to the CLI when you have to.

For this article, I am going to focus on Studio. I’m also going to focus on the SCM system Git. Git is quite popular these days as it is lightweight and open source. The concepts I will discuss in Git can be applied to other SCM systems as well.

IBM provides some guidance on source control for Mobile First Platform in the Knowledge Center at http://www-01.ibm.com/support/knowledgecenter/SSHS8R_7.0.0/com.ibm.worklight.dev.doc/devref/r_integrating_with_source_contro.html. This page is the place to start when you are considering implementing SCM on your project. The hierarchy diagram depicts the structure of a MobileFirst project. It is important to understand this because not all files in the project should be put under SCM control.

Types of files in MobileFirst projects

Git provides a mechanism using files called .gitignore that enables you to ignore or just not manage certain files. Why wouldn’t you control all files in the project? Let’s break the project down into categories of files to understand that better.

Source Files

Source files are files that you create or modify to build your application. These would be html, css, JavaScript or other types of files that contain the guts of your application. You definitely want to control these files because if you were to lose them, you would lose functionality. A good example is all the content within the common folder which would be the code for the hybrid application you are building. That’s fairly straightforward.

Derived Files

The IBM documentation doesn’t use the term “derived” file. That’s my own term for files that are produced or generated by MFP when you perform an operation such as build or deploy. The www folder within the native folder of each environment is a good example. When developing a hybrid application, MFP enables you to create code once, but use it in all your environments. The tools used to build the app (such as the SDK for Android or Xcode for iOS environments) might require all hybrid content to exist within the structure of the native folder. The hybrid content from the MFP common folder is copied to the environment’s www folder during the build phase. Therefore, there is no benefit to managing the content of the www folders within the environments. That content just gets overwritten with each new build. If you accidentally deleted it, nothing is lost.  It is best to just ignore all these files and not put them under SCM control.
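
To make that concrete, a hypothetical .gitignore fragment (the exact paths depend on your project layout, and Andrew Ferrier’s template discussed below is far more thorough):

    # Derived hybrid content copied into each environment at build time
    apps/*/iphone/native/www/
    apps/*/android/native/assets/www/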

Derived but Required Files

Things get a little trickier when you start considering how you will share your project with others. There is a category of files whose contents are derived, but MFP only generates them when the environment is first created. The files are not generated or copied with each build or deploy. However, these files are required to exist by the build tools such as Xcode. If you don’t include them in SCM and your teammate grabs a copy of your code, the project won’t build without them.

Let’s look at an example. I am building a hybrid app named mySampleApp in an MFP project named mySampleProject that will run on iPad. When I first create the iPad environment, MFP creates the file mySampleProjectmySampleAppIpad-Info.plist in the iPad native folder. This file contains, among other things, the app version number string. If I change the app version number in the application-descriptor.xml file and rebuild, the *Info.plist file is changed and I have a new version to check in to SCM. I would like to avoid that since the version number in the *Info.plist file is derived. But if I don’t add the *Info.plist file to my SCM system and my teammate tries to build without this file, the build will fail. The file is not automatically created by MFP if it doesn’t exist.

This leads you to a bit of a quandary: you don’t want to control these derived files, yet you need to add them to source control in order to maintain a complete, buildable project.

The best approach is to put them under source control and live with the minor consequences. Those consequences are that each time certain changes are made by developers, those files show up as changed. You can often decide to do nothing with the change and you will be fine because the build will regenerate the content within the file.

Gitignore

As mentioned earlier, Git provides a way to specify files you want to ignore. You create one or more .gitignore files and include the list of derived files you don’t want to be put under control. Git supports the notion of multiple .gitignore files spread across the file system, the effects of which are cumulative. As a best practice, I would recommend against this. Keeping a single .gitignore at the root of your MFP project is sufficient and it makes it obvious where to go if you need to make changes to the .gitignore file.

Andrew Ferrier has created a great template .gitignore file (https://github.com/andrewferrier/mfp-gitignore). You can download and use the file freely and he welcomes any feedback or improvements.

Cleaning up the mess of a bad start

You probably won’t get the .gitignore right the first time. That’s fine – you can tweak as you go. But, you need to know that adding a file to .gitignore does not remove it from Git. If the file was already in Git, the .gitignore file has no effect.

The best course of action in this case is to merge everything back into a single branch, fix the .gitignore file, then remove files from source control as needed. Andrew Ferrier has also created a tool that will help you automate the task (https://github.com/andrewferrier/git-utilities/blob/master/gitaliases ).
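
The key Git step in that cleanup, whether you script it or do it by hand, is untracking the files without deleting them from the working tree:

    # Stop tracking derived files that .gitignore now covers,
    # leaving the on-disk copies intact.
    git rm -r --cached path/to/derived/files
    git commit -m "Stop tracking derived files"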

An effective branching model

Once you have your code in Git, you need to decide how you are going to work with it. Git is very flexible so you really need to establish some conventions and procedures so your entire team knows what is going on.

Branching and tagging conventions are probably the most important things to agree upon. Again, there are many ways you could do this, but one of the most widely accepted models is one proposed by Vincent Driessen several years ago (http://nvie.com/posts/a-successful-git-branching-model/ ). This model may be more than you need on a small project, but there are lots of good ideas here you can glean.

Daniel Kummer has created some Git extensions that automate Driessen’s model (http://danielkummer.github.io/git-flow-cheatsheet/ ). This is nothing you couldn’t do through adhering to your established conventions, but it helps automate and enforce it.
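
For a taste, the extensions wrap the model’s branch conventions in commands like these:

    git flow init                      # set up master/develop and prefixes
    git flow feature start my-widget   # branch feature/my-widget off develop
    git flow feature finish my-widget  # merge it back and delete the branch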

In the next post, I will walk through an example using IBM DevOps Services as the master Git repository for the project.

Sneak Peek – Mobile Quality. First Impressions Count

Javier Bastante of Keynote Systems and I will be presenting a session called “Mobile Quality. First Impressions Count” next week at IBM Innovate.  We are going to talk about a number of things, but one area we will address in particular is how you can combine the automated testing capabilities of IBM Rational Test Workbench with the Keynote DeviceAnywhere mobile device cloud to really increase your platform coverage and decrease your time to test.  We plan to demo this live.  Test Workbench is on a Windows machine hosted on the SoftLayer cloud.  The devices we will use to test on are hosted on the DeviceAnywhere cloud.

I have full confidence in both of these solutions.  However, I have been burned by poor hotel conference room network connections before.  To alleviate that risk and reduce my stress level, I prerecorded a shortened version of the demo we will show.  Just in case.  Who knows. Better safe than sorry.  So, I have a backup plan.

While I had this nice video on my laptop anyway, I thought, “Why not share it with our audience viewing at home?”  So I get a backup plan and you get a sneak peek at the demo we will show at the conference.

And then I thought, “Maybe someone that had not planned on coming to our session will see the video on YouTube, become interested and decide to attend our session.”  So I get a backup plan, you get a sneak peek and I get a shameless plug for my session and might possibly be able to drive up attendance.  Hmmm, seems like I am getting the better end of this deal, but I hope you find the video informative anyway.  Oh, and don’t be afraid to drop in on our session.  Tuesday, 11:15AM.  Dolphin-Northern A4.

IBM Innovate is just around the corner


The IBM Innovate conference is one of the highlights of my year.  Because I work remotely as do many of my peers, Innovate is one of the few chances I get to see some of my teammates face to face.  Whether it is sharing lunch at the conference, a couple IPAs at the brew pub or a tin of Queso dip and chips in Picabo at 2am after the theme park (I’m somewhat embarrassed to say this has become an annual tradition), it’s great to catch up with old friends.

But, more importantly, Innovate is an opportunity to get face time with lots of our customers and partners.  I travel around and visit customers all the time, but Innovate gives a great bang for the buck as I can talk to an electronics manufacturer over breakfast, meet folks from a fascinating new start-up at a workshop, have dinner with a multinational banking customer and have a beer with a business partner all in the same day.  The electronics manufacturer is looking at how they can deploy Rational Quality Manager across their entire enterprise for thousands of users and is concerned about how they will transition from their legacy test management solution.  The start-up is strapped for cash and needs to know how they could automate their continuous build process while integrating with open source SCM and testing tools.  The bankers are positioning themselves to capture the huge market opportunity for mobile banking in sub-Saharan Africa.  The business partner (although he primarily just likes to drink beer) wants to see how his firm can partner with IBM to deliver more value to clients.  There is a huge diversity of interests, challenges and solutions being discussed at Innovate.  It is a geekfest and the energy in the sessions, keynotes and workshops is electric.

As a Solution Architect, all these diverse customers are generally looking for me to suggest what IBM products and solutions will help them – and that’s the fun part of my job.  The dirty little secret is that I generally get far more out of those conversations than they do.  I come away from these conversations with new insights and perspectives on what people are doing today with our tools and what they want to do tomorrow.  That electronics manufacturer needs more capabilities in the RQM REST interface.  That start-up is using an automated test framework I’ve never heard of before.  Those bankers are testing and deploying their mobile apps on a two-week cycle.  That business partner wants RTW to run on their mobile device cloud.  These are all fascinating new challenges to go back and discuss with our developers, product managers and services folks.  Addressing them is what sets us apart from the competition.  It is also what keeps things interesting and challenging.

So, give me a shout if you are at Innovate.  I would love to chat it up with you and see what your next big challenge is.  Reach out to me on the Innovate app (well, one hasn’t been announced yet, but I am sure there will be one).

Here’s my presentation schedule.  I would love to see you at one or more of my sessions!  Rumor has it that they will be giving away a new car at one of my sessions.  (OK, that’s not true.  I only said that to drive up attendance.  But come anyway.  I’m sure we will have fun!)

IMD-1202 : Mobile Quality. First Impressions Count.
Date/Time : Tue, 03-Jun, 11:15 AM-11:45 AM
Room : Dolphin-Northern A4
Co-presenter : Javier Bastante, Keynote Systems

WRK-2391 : Quality Management and Testing Mobile Apps
Date/Time : Wed, 04-Jun, 10:00 AM-01:00 PM
Room : Dolphin-Northern A2

DQM-1198 : Avoiding the Dreaded “One Star” Review: Automating Mobile Testing to Improve App Quality
Date/Time : Wed, 04-Jun, 01:45 PM-02:45 PM
Room : Dolphin-Northern D