iPad as main computer on the go – Lost the lag.

I'm still on an accessibility subtopic in all of this, because it's the accessibility of apps, or the workarounds, that will ultimately make this usable or not. I have no doubt that if a sighted system administrator decided tomorrow to use an iPad for everything, she or he would find it much easier, because there's a larger array of apps to choose from. Hell, VNC would even be possible if the user was absolutely mad enough to go down that route. I have nothing against VNC, but I prefer the integration of RDP in the Windows world. So, tonight, I'm writing about another accessibility-related thing. But this time, it's really positive.

I've lost the lag! Screen readers in Windows have become so much better in recent years. They are better at memory management, they have much more efficient virtual buffers (scan mode, browse mode, or whatever they call it), and they are just generally more reliable, faster and more stable than ever before. But the reality they need to contend with is that they are bolted on, so in some situations they are absolutely dependent on Windows applications behaving nicely. Take Outlook as an example. This is a very efficient and accessible application for a screen reader user, but there's no getting around it: in the default layout, when you arrow around messages, each message opens in the preview window, and for just under a second while that message opens, the screen reader sits around picking its nose or something. It's really frustrating. If you throw enough CPU and RAM at it, the process seems to be hugely improved, but good luck asking your employer for a quad-core processor and 16 to 32GB of RAM just because your screen reader is a little more responsive with the extra resources. In SCCM the same thing happens. When you arrow around the tree view, the main view opens for each item, so if you go past the security compliance view or the Office 365 management option, you could be waiting for five to ten seconds before the next item in the list can be reached. It's really annoying!

But with iOS, VoiceOver is built in. As some would say, it's baked in. Either way, it means the same thing: iOS is built with VoiceOver in mind, so the operating system allows for accessibility use cases that were not considered when the Windows interface was developed many moons ago. In iOS, you can very quickly move between emails without opening each item. You can jump over to attachments with just a swipe and there's no lag. You can alt-tab to another application and there's absolutely no lag. You can have split screen on the iPad and jump between notes and your to-do window without even considering it. It's weird. My entire method of taking notes and tracking my to-do list in meetings has changed, because I'm unlearning a habit I've had for years where I would try to stay in the one window for as long as possible to save the vital milliseconds that would be lost by the screen reader lagging slightly behind. The same goes for browsing the web and filling in forms. I don't need to wait for that fraction of a second after the page loads to be sure I'm in the edit area of a form before I start typing. As soon as VoiceOver indicates that the edit field has focus, I know with certainty that I can just start typing and it will work flawlessly.

This might seem like I'm putting Windows screen readers and/or Microsoft down. I'm not. Not at all. Windows screen readers, Microsoft Windows and the associated applications are infinitely more powerful and versatile than this iPad. But because iOS has been designed with this hardware and app platform in mind, it gains massive boosts in speed and efficiency.

If you aren't a screen reader user, you would be forgiven for being skeptical of my description of saving milliseconds. But consider that when you see something, you can act on it right away. You don't give it a second thought. I don't understand what goes on in our minds, of course, but I think it's very much the same for people who can't see when it comes to audible cues. When I hear a beep, a ding, or the screen reader say something, I don't think about the next action, I just do it. I spend about 70 hours a week on various electronic devices for work and other things, so interacting with a screen reader is as natural to me as looking out the window is for you.

It's important that I pepper some positivity into this series of articles, because I've been quite harsh in many areas of my review over the past few days. I wanted you to be aware that actually, I'm really finding this to be a great replacement for my laptop.

Soon, I’ll write about audio and video editing. I’m effectively doing this now. Is it efficient, or as efficient as on Windows? You should come back soon to find out.

Thanks for sharing these posts on social media. Drop me a comment though. At a minimum, it looks really bad when these posts are getting lots of visitors and no comments, but really, I just want to hear what you think about this series. Is there something in particular you want to read about?

iPad as main computer on the go – Remote desktop: USB pass-through

My thinking was: what if I could get a USB-C to USB-A adapter, connect a Jaws authorization dongle to it, and remotely authorize Jaws on a system that I was connecting to using remote desktop?

I also thought it could be handy to use my USB key that has NVDA and a few other useful utilities on it.

But unfortunately this was not to be. The remote desktop client made by Microsoft for the iPad doesn't seem to pass USB storage through to remote desktops.

This is unfortunately another big blow to my use of the iPad, but I have other methods of getting around these limitations, so the exploration of this platform as my computer while on the go is continuing. For meetings, and for reading and writing on the way to and from work, this very small device has certainly been great.

iPad as main computer on the go – Remote desktop.

I'm a system administrator, so I regularly need to log on to servers. Primarily this is done through the Remote Desktop Protocol, RDP for short. When I first thought of using an iPad as my main computer when on the go, my main concern was: how am I going to connect to a server if it gets into trouble while I'm on the road? Fortunately, Microsoft have a native remote desktop solution for the iPad called RD Client. This app is really well thought out. It supports saving credentials and connections, and it even has workspaces to separate out the connections. This is great because I don't just have remote desktop connections for work; I also have them for other companies.

The question you will have is: how do I use assistive software such as Jaws on a remote machine? Well, it's very simple. The sound is sent from the remote system to my iPad just like standard system sounds are. Latency seems exceptionally low, even when connecting to machines on remote networks.

Remote networks bring me to a great point. I'm really glad that I've been able to find several VPN clients for the iPad that let me connect to the various networks I need access to. So, yes, even on the iPad, I can connect to everything I need. There's one drawback though: I seem to have a problem connecting to some VPNs when I'm using the iPhone as an access point for the iPad. I will investigate this further.

Remote desktop connections are quite efficient on the iPad, and with it I can certainly log in and poke around. However, for any kind of serious work, I find that I need a keyboard with proper Control, Alt and function keys. I have some more research to do on this topic. I have ordered a USB-C to USB-A cable so that I can try different keyboard combinations. There are other major limitations that I'm hoping to get around.

The major limitation at the moment is that the Caps Lock key is not sent to the remote machine. As Jaws uses this as a modifier, I am therefore unable to issue Jaws-specific commands. I'm hoping to get around this with a proper Windows keyboard. You can expect me to write more regarding this topic in the coming days.
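
One idea I want to explore, and I should stress this is an untested sketch rather than something I've actually done, is remapping a key that the iPad keyboard does send so that the remote Windows machine treats it as Insert, which Jaws can also use as its modifier. Windows supports that kind of remap through the Scancode Map registry value, and a short Python script run once on the remote server (as administrator, with a sign-out or reboot afterwards) could set it. The choice of Right Alt below is purely an assumption on my part; I'd need to confirm which keys the RD Client actually passes through.

    import winreg

    # Untested sketch: make Right Alt (scan code E0 38) behave as Insert (E0 52)
    # on the remote Windows machine, so a key the iPad client does send can act
    # as the Jaws modifier. Needs administrator rights; takes effect after a
    # sign-out or reboot.
    SCANCODE_MAP = bytes([
        0x00, 0x00, 0x00, 0x00,  # header: version, always zero
        0x00, 0x00, 0x00, 0x00,  # header: flags, always zero
        0x02, 0x00, 0x00, 0x00,  # number of entries, including the terminator
        0x52, 0xE0, 0x38, 0xE0,  # pressing Right Alt (E0 38) now sends Insert (E0 52)
        0x00, 0x00, 0x00, 0x00,  # terminator entry
    ])

    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SYSTEM\CurrentControlSet\Control\Keyboard Layout",
        0,
        winreg.KEY_SET_VALUE,
    )
    winreg.SetValueEx(key, "Scancode Map", 0, winreg.REG_BINARY, SCANCODE_MAP)
    winreg.CloseKey(key)

The obvious downside is that the remote machine would lose Right Alt for everything else, so if I do try this, it will be on a test box first.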

Overall though, I'm happy with this solution. I haven't yet had to do anything on a Windows server in an emergency, and that's when I'll really be able to tell if this is a viable solution for on the go.

iPad as main computer on the go – Working while traveling.

I was going to write about remote desktop access today, but I have more research to do before I can write on that topic with any authority. Instead, I'm going to give a brief note on using the iPad Pro in place of a full-size laptop while on the bus or train.

The Surface Book 2 and the XPS occupy roughly the same area on my lap, and in terms of typing they both promote similar hand placement. With the Surface Book 2, however, the weight is distributed at the top, which to me made the laptop feel shaky. The XPS has a 16:9 screen, so the top of the screen didn't stick up as much, meaning it could be used in more confined spaces where the seat in front jutted out near the top. The Surface Book was never really that comfortable to use on a bus or a train.

The iPad Pro with the keyboard probably isn't going to be a runner for in-depth work on the bus, certainly not when using the keyboard. For comfortable hand placement, I like to have the device sitting more toward my knees, but the angle the screen then sits at by default is hampered by the seat in front of me. The angle of the seat pushes against the iPad, causing it to come away from the magnetised case. This is really unfortunate. I tend to do a lot of reading on the bus, and I find that issuing navigational commands through the keyboard is often far more efficient than using the touch screen, so I'm disappointed that the form factor doesn't work as I had hoped.

On the train, though, there is more space. Depending on the seat you choose, you can grab a table and place the iPad on it, or, if you have a seat without a table, the iPad is perfectly stable on your knee.

Mary Jo Foley, a journalist who covers all things Microsoft, talks about the concept of “lapability”. It's the idea that a laptop will sit on your knee without feeling off balance, even when your hands are not on the keyboard. I've had the Surface Book 1 and then the Surface Book 2 for a few years now, and lapability is something I've missed. I'm really relieved that this device has it. I don't find myself keeping a firm grip on it all the time when traveling; it comfortably sits on my lap. Also, because there's absolutely no heat coming out of it, it's very comfortable to use.

That's a long post about traveling with these devices, but traveling is something I do a lot of, so being able to use a device in these situations is very important to me.

iPad as main computer on the go – Meetings and note taking

I wrote last night about using the iPad as my primary mobile device when on the go, so let's continue that. I'm going to focus on productivity during a meeting in this post.

I had a meeting scheduled for first thing this morning. In meetings, I tend to use a few different applications: I write notes, I track to-do items using the Microsoft To Do app, and I refer to email regularly when tracking conversations that were held outside the meeting.

This is all really easily done on the iPad. I used Split View to keep my notes and To Do on the same screen, then used the Outlook client to go through email. The search tab makes this really easy in Outlook, and unlike in Windows, there's no lag when searching and there are no weird keyboard commands and tab sequences to jump through. Overall, I found this to be a much more efficient way of managing data in and out during that very long meeting.

It was necessary to pull together information spanning the year or more since this particular project started. I had files stored in OneDrive, but again, finding them was easy with the search feature. The fact that OneDrive also surfaces recently used files, making it easier to quickly find things you had been working on, made it much faster to get what I needed, and it really gave me a nice sense of having prepared for this meeting because everything I needed was at my fingertips.

The iPad keyboard has also continued to impress me. Not necessarily the keyboard itself, although, yes, I like it quite a bit, but more that the iPad and app keyboard commands are great. In Outlook, for example, Command-N creates a new message and Command-Enter sends it. It really feels like I'm not losing my desktop shortcuts.

Next, I'll write a little about my first attempts at using remote desktop. But I have a few other things to try first.

Changing my primary mobile device to an iPad.

It's a new decade, and it seems that I'm marking it with a really new way of working. I decided back in December that I was going to trade in my trusty Surface Book 2 and go for something more portable. I love that device. It's powerful, the keyboard is particularly comfortable and the battery life is exceptional. I have happily left home and attended a full conference without even considering the need to bring a plug. With standard note taking, checking email and talking with the office over IM, the laptop seems happy to work well over ten hours without a charge.

But it was too big. And let's face it, too powerful for my day-to-day needs. I occasionally jump into some serious development in Visual Studio, but that's the exception, not the norm. I certainly go in and out of remote desktops a lot and I use a lot of administrative tools, but overall, I used the Surface Book for writing. Be it Microsoft Word, emails, instant messaging, blog posts, technical documentation and so on, it was primarily a day-to-day standby machine. The main worker at home is my desktop, which is a beast of a machine. In work, I have a reasonably powerful laptop and desktop, so really, the Surface Book 2 just wasn't getting to stretch its legs often enough.

So I've decided, without much deep consideration, to give the iPad Pro a shot. My priorities this year have changed a little compared to last year. I intend to create more video and audio content, and I don't want to be tethered to my desk, so this little device is kind of perfect, I think, for what I'm going to attempt.

Of course, I need something to write on regularly, so a device with a keyboard is vital. Fortunately, the iPad Pro comes with a really nifty little keyboard, and the typing experience, although not as nice as on the Surface Book 2, is certainly passable. However, let me confirm that in a week or two when I've spent a few dozen hours typing things out.

One thing that concerns me is the lack of great spell-checking facilities. My spelling is absolutely terrible, and I'm not sure how I'm going to cope without that. For example, I deliberately misspelled the word cope there and tried to figure out whether it was flagged as incorrect. The only way I could do it was to read word by word. There is a popup that may be displayed in certain situations; however, I haven't been able to trigger it reliably yet. I should probably read a manual. I'm learning the more complicated commands by trial and error at the moment.

This is just a short blog post to get started with authoring on the iPad. Let’s see how it goes.

So far, reading the content back on the iPad has certainly been accessible. That's really nice, as a year or two ago the WordPress app wasn't very accessible when trying to read text in edit areas.

HGV convoy in aid of Fionn's parade of lights.

Last time I published a post, I explained how I was getting into videography. Here's the first attempt in a while, although most of the credit for the video must go to my wife.

We were in a trucking convoy earlier. It was in aid of a charity initiative called Fionn's parade of lights. Here's the video.

The night before Christmas – 2019

You might wonder: why, on one of the busiest nights of the year, would you record such a complicated podcast and edit it before Christmas day even starts? Well, the reason is quite simple. I can't look back on pictures. These podcasts serve as my memories and as a way of tracking how my children have interacted with Christmas over the years. It's amazingly good fun to listen back over these together.

There are family members abroad as well who I think will get some enjoyment out of these recordings.

To everyone who continues to visit this website, please let me wish you a very happy Christmas and a successful and rewarding new year.

Videoing techniques

Since the last blog post about video blogging, or vlogging, I've invested in more equipment and I've researched a lot of techniques as well. Here are a few videos that I started off with.

Useful learning resources

  • Movements for cinematic shots using a gimbal.
  • Tips for shooting video with a phone. This also gave a basic introduction to B-roll.
  • How to plan and record B-roll. This person has some fantastic tutorials. I've learned a lot from him but I haven't figured out how to implement it yet in my situation.

The problem now is that I have a beginner's understanding of how to frame shots and of what kinds of angles, transitions and lenses are best in some situations; however, I still have the problem of actually knowing whether the shot I have taken is good or not. I've resorted to using Aira to help me frame a shot. I may then even call up again to ask them to take a look at the footage. But I admit, that's not a great use of my Aira minutes, and the Aira agents also try to be very positive, whereas I need less positivity and more honest opinions to figure out whether what I have done is any good.

The Kit

Aside from learning, I've also invested in some new kit. Christmas was good to me as well. My wife bought me a very nice Shure MV88+ microphone. This tiny little microphone sounds great and comes with handy clips for the phone and the gimbal.

On to the gimbal. I have had a cheap gimbal for about a year now, but I've recently upgraded to the Freefly Movi Cinema Robot. This provides way more success when shooting stable video, and the feature set is much more professional than that of the previous gimbal. It's pricey even on its own, but when combined with everything else that's needed to make this work effectively, it's eye-wateringly, stupidly expensive.

The last bit of kit I invested in recently was an 18mm lens from a company called Moment. This lens attaches to a case for the iPhone and nearly doubles the field of view of the phone. It's a really powerful addition, and from what I've been told by people who have commented on videos I've posted, it really helps make video shot on the phone look much more professional.

Here’s how it all looks when it’s together:

Picture of the iPhone attached to a gimbal with a camera on the top. It looks very professional

But nothing is ever simple. Take a look at the list below to see how many extra parts were needed just to make all this fit together.

I am not receiving anything for mentioning any of these products. If I was, I would probably just point out that Freefly and Moment have products that work really well together and that careful attention has clearly been given to making their designs complement each other. But the reality is that when adding a Moment lens, there are two other rather pricey components that need to be added. The lens on its own is pricey enough; adding the counterweights and the case just makes me feel cheated.

This post is long enough. I’ll explain more about how I’m using all of this in a future post. For now though, take a look at a video that was captured using all of this.

Blind videographer

I'm really trying to complicate life for myself these days, but the art of videography really interests me. In this post I'm going to tell you how I have recorded video so far and how I intend to do it better in the near future. But first, here are a few reasons why this has captured my attention. It's important you know why I'm doing this, so you understand the motivation behind some of the choices.

  • I'm fascinated by the attention span, and by the combination of science and art that can either retain that attention or completely lose it. Social media is rotting our attention span; it's written about in many places, but here's one article on the topic. I got into this more in the past few years because when a video is posted to Facebook, a viewer only needs to watch it for about 15 seconds before it's counted towards the number of views that video has had. But people who watch a video for longer are more likely to engage with it. That could be the difference between a sale and a missed opportunity. Videos can't just be interesting or informative; they need to have dynamic visual and audible content that engages the viewer.
  • All of this visual curiosity goes back to the very first books I read on website design twenty years ago. I learned back then about contrast, why you shouldn't put large moving images on a site, why scroll bars on the home page are a bad idea, why pages buried behind lots of layers of links aren't very popular, and more. It became obvious that to make a good website, the visual interaction was way more important than the content during those first few seconds after a visitor landed on the site. That initial reaction is now even more important. Every 10 seconds or so, you need to make a brand-new impact to keep a person's attention, because there's so much content out there that if your video isn't appealing, there are another twenty funny cat videos for them to look at.
  • Video is incredibly creative. I really enjoy reading about camera placement. Of course, it's very easy to get bogged down in the technical side of this, but people are all about the angles. And the angles should change regularly to keep the eyes interested and the attention on the topic. Some movement is also good. So, it's kind of the reverse of what applies to website design. People try to get really creative with angles. For example, one person whose material I enjoy reading explained in great detail a segment where he was pouring water into a glass. First, he put the camera into the fridge so that when he opened the door, the camera was looking out at him. Then he put the camera facing down when he put the bottle onto the counter, so that the video was now taken from the top. Then he held the camera so that it looked like the video angle was from the perspective of his hand reaching for a glass. His last segment was then pouring the liquid into the glass. He put the glass on top of the camera and videoed the liquid entering the glass from the perspective of looking up from the bottom. This very simple activity of taking a bottle out of the fridge, putting it on the counter, getting the glass and then filling the glass with the liquid in the bottle was done in four segments. This made it very dynamic and interesting for someone watching what would otherwise be a very boring task. It's just an example, but it highlights the freedom and potential for creativity when shooting a video. The really interesting and fun thing for me is trying to think of ways that might be interesting to capture these different angles. I haven't practised this much, and at the moment my ideas are very basic. But with some help, I intend to get a little more complicated.
  • Of course, the technical side of things interests me. Video was such an unattainable medium to create in up until a few years ago, but the technology has become so much more usable and accessible that it's now within our grasp.

So, how am I going to become in any way proficient in creating something in a medium that is inherently visual? I'm not sure. But what I can say is, here's how I've managed to get results so far.

  • Use the iPhone. VoiceOver makes the camera interface completely accessible.
  • On the iPhone, take a picture first if you aren't sure that the people you want are in the shot, and to make sure there's enough light. VoiceOver should tell you how many faces it sees and whether the picture you take is blurry.
  • Record in landscape, i.e. with the phone on its side. Make sure VoiceOver tells you that you are in landscape before you hit the button.
  • Use the volume up or volume down button to start and stop the video recording. It is easier to hold the phone straight when using these buttons, as you aren't tempted to tilt the screen.
  • Get help learning how the phone feels in your grip when it is straight. In later versions of iOS 13, VoiceOver will tell you if you need to tilt left or right, and it then makes a sound when you are straight. But you also want to make sure the camera isn't facing too far up or too far down.
  • If in doubt, ask someone to take the video for you. I've rarely been told no when I ask someone to take over. In fact, people are always curious as to how I was going to do it in the first place. If they tell me they aren't good with cameras, I joke that there's no doubt they will be better at it than me.
  • Feel free to record a really quick note at the beginning of your video to say what you are trying to capture, along with when and where you are recording. You can easily edit this out later in iMovie. But leave yourself some time: moving along the timeline in iMovie with VoiceOver, on both the Mac and the iPhone, is not as exact as it is for someone who can see, because with VoiceOver you can only move in set increments.
  • Use a gimbal. I bought a cheap one a year ago and I've never looked back. Really bad pun. But seriously, the gimbal makes things so much easier. Just put the phone into the bracket, turn on the gimbal and let it do the rest. It will keep the phone straight and facing forward. Depending on the gimbal, you can even use special apps to help you focus on people, so that when they move, the gimbal moves along with them.

iMovie is accessible, but be prepared for the following workaround on the iPhone. To enable the controls at the bottom of the screen, find the timeline with your finger. Now, with your other hand, turn off VoiceOver. Do not move your finger. When VoiceOver is turned off, tap once, then turn VoiceOver back on. The controls are now exposed at the bottom. I do this about 200 times when editing a video, so it has become a very quick process. Triple tapping and holding doesn't always work, but this does. Also, if you have problems setting the play head to a specific location, just listen to the video to the point that you want to cut from and then hit pause. You can then work from that point.

There are a few ways that I aim to improve my setup and my video recording results.

  • I've invested in an 18mm lens for the iPhone. Nothing will ever beat the video captured on a professional video camera or even a DSLR, but those methods aren't accessible, so a separate lens will do the job nicely. One of the main reasons I wanted a wide-angle lens is that by making the field of view larger, I know that no matter what way I point the camera, I'm much more likely to get something in the shot. I'm hoping, and so far this has proved accurate enough, that even if I don't get the people centred, that mishap will either be forgiven by my audience or, looking at this more positively, the off-centre shot will help feed that dynamic angle objective I was talking about earlier.
  • A better gimbal. The gimbal that I've been using is basically a very complicated version of a selfie stick, but this makes it a little difficult at times to be certain of the exact direction the phone is facing, particularly when things are busy. So, the gimbal I have gone with now has the mount on the side and allows for two-handed operation, giving much better awareness of the direction I'm trying to record in.

I've spent months now talking with people, in person and online, about video angles, equipment, lighting and technologies. I have learned a lot, but I'm only scratching the surface. Some people have been really interested in what I'm trying to achieve, so they have actively tried to find ways of helping me ensure that everyone is in frame. With all of my shots taken from a wide-angle, zoomed-out perspective in landscape, I'm always going to be the most basic of videographers, but that's fine. I'm really okay with that. I still think I can have some fun learning about angles and movement, and although my videos won't be spectacular, I'm going to have some fun learning what does and doesn't work.