
Monthly Archives: August 2014

Are you outdated?

The Gartner Hype Cycle 2014 special report is out

So – here it is: Gartner’s assessment of emerging technologies for 2014. And for the first time in years, I don’t really have anything substantial to query in it. However, two things are worth mentioning:

Cloud Computing’s disillusionment

It’s “The End of the Cloud as we know it“, I said recently. Gartner – in quite a similar way – sees the Cloud entering the trough of disillusionment with many signs of fatigue, partly accompanied by rampant “cloud washing” but also driven by many – if not all – vendors offering a Cloud Strategy although “many aren’t cloud-centric and some of their cloud strategies are in name only“. The early promises of massive cost savings have finally worn out in favour of more realistic advantages of a move into the cloud. And Gartner acknowledges that Cloud continues to be one of the most hyped topics in IT history, with organizations that develop a true cloud strategy focusing on the real benefits such as agility, speed, time to market and innovation.

Journey into the Digital Age

However, what’s far more important and interesting than the Hype Cycle itself is the accompanying publication of the “Journey into the Digital Age”, which comes – according to Gartner – with 6 business era models. These models – alongside their respective driving technologies – characterize the focus and outcome of organizations operating within each of those eras. The dividing lines between them are

  • the “Web”, before which the only relevant era was “Analog”, characterized by CRM and ERP as the most important emerging technologies, and
  • the “Nexus of Forces” (mobile, social, cloud and information), which separates “Web” (as an era), “E-Business” and “Digital Marketing” from “Digital Business” and “Autonomous”

While the era of “Digital Marketing” is mostly what we see with innovative organizations these days, it is the last 2 eras that separate the latter from the real innovators and the founders of the next age of IT (claimed by many to be called “Industry 4.0”):

  • Digital Business – mainly driven by how the “Internet of Things” changes the way to do business and to interact with customers – will be the era where our physical and virtual worlds blur and businesses adopt and mature technologies like 3D printing/scanning, sensor or machine-to-machine technologies or even cryptocurrencies (e.g. Bitcoin). We should be watching out for the main innovators in the healthcare domain to show us the way into and through this era within the next few years.
  • Autonomous – to me – is the most compelling of those 6 business era models. According to Gartner it represents the final post-nexus stage (which i.m.h.o. will change, as evolution is ubiquitous and change is constant) and is characterized by organizations’ ability to “leverage technologies that provide humanlike or humanreplacing capabilities“. Enterprises capable of operating within this business era model will push innovative solutions of all kinds that turn normal day-to-day activities – like driving cars, writing texts, understanding languages or assisting each other – into automated, autonomous tasks.

When writing “Innovation doesn’t happen in IT” around the same time last year, I was overwhelmed by the fact that we’re beginning to leave an age in which IT was a discipline in itself. These days, we sense an even stronger move towards IT being ubiquitous, the Nexus of Forces being felt in our everyday lives and IT becoming a servant of what’s really important.

I’m hoping for it being a humble servant!


(download the full Gartner Hype Cycle of Emerging Technologies Report here)


How to Recover from Enterprise Vault

… that moment, when in the cloud – in a real one, i.e. in a plane somewhere over an ocean – you’ve eventually got nothing else to do than read those loads of docs you dropped into your mailbox for later use … that – very – moment … when your enterprise’s archiver kicks in and Outlook tells you it can’t load your eMail as you are – guess what? – OFFLINE!

Here’s what I did.


Why?

Enterprise Vault is a great archiving solution. It integrates pretty seamlessly with Outlook: you don’t notice any difference in accessing eMails, whether they have meanwhile been archived or not. There is, however, a difference: once Vault has gotten hold of one of your eMails, all you really have left in your folders is in essence a stub of 300 characters embedding a link to the respective Vault item of your eMail.

And then there are those occasions when you want to access exactly those old eMails that Vault has long ago grasped – also when offline. And, honestly: PST is not such a bad concept (while I do appreciate companies’ aim to reduce (restrict) PST usage). Anyway: I gave this some thought recently and ultimately created a solution which works perfectly for me and now lets me access all my old mail again – through a PST folder.

This one’s to explain how that solution works:


The Solution

is a simple piece of Outlook VBA code grabbing any vaulted eMail, opening it and copying it to a respective PST folder. Once opened and copied (the “copy” is key) it loses its vault link and gets its entire content back.

1: Search vaulted eMails

First of all, I defined an Outlook Search Folder to grab all vaulted eMails. This can be done by querying the .MessageClass field:

[Screenshot: Search Folder criteria querying the MessageClass field]

I went by the Search Folder idea as otherwise I’d have to walk through all eMails to find the vaulted ones. BTW: on vaulted eMails the MessageClass field reads “IPM.Note.EnterpriseVault.Shortcut” in its entirety.
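
If you’d rather create that Search Folder programmatically, a minimal sketch via Application.AdvancedSearch could look like the following – note that the scope (“Inbox”) and the search folder name (“Vaulted”) are just my examples, not part of the original setup:

Sub CreateVaultSearchFolder()
 ' sketch: persist a Search Folder matching all vaulted eMails;
 ' scope ("Inbox") and folder name ("Vaulted") are examples
 Dim oSearch As Outlook.Search
 Dim strFilter As String
 ' PR_MESSAGE_CLASS, referenced via its DASL proptag
 strFilter = """http://schemas.microsoft.com/mapi/proptag/0x001A001E""" & _
             " = 'IPM.Note.EnterpriseVault.Shortcut'"
 Set oSearch = Application.AdvancedSearch("'Inbox'", strFilter, True, "VaultSearch")
 oSearch.Save "Vaulted" ' saves the (asynchronously populating) search as a Search Folder
End Sub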

2: Folder structure

I then wanted to replicate my folder tree in the target PST – just … well: just ’cause I’m used to it. That’s a little recursion:

Function CreateFolder_Recursive(aRootFolder As Outlook.MAPIFolder, aFolder As Outlook.MAPIFolder, bMailOnly As Boolean) _
  As Outlook.MAPIFolder
Dim fldReturn As Outlook.MAPIFolder
Dim itm As Object
Dim subFld As Outlook.MAPIFolder
 ' check whether a folder of aFolder's name already exists below the root ...
 For Each itm In aRootFolder.Folders
  If itm.Name = aFolder.Name Then
   Set fldReturn = itm
   ' ... and if so, only recurse into its subfolders
   For Each subFld In aFolder.Folders
    CreateFolder_Recursive fldReturn, subFld, bMailOnly
   Next subFld
   Exit For
  End If
 Next itm
 If fldReturn Is Nothing Then
  ' create the folder only if it is a mailfolder or if the parameter flag indicates that we shall create all folders
  If aFolder.DefaultItemType = olMailItem Or Not bMailOnly Then
   Set fldReturn = aRootFolder.Folders.Add(aFolder.Name)
  End If
  If Not (fldReturn Is Nothing) Then
   For Each subFld In aFolder.Folders
    CreateFolder_Recursive fldReturn, subFld, bMailOnly
   Next subFld
  End If
 End If
 ' a VBA function returns whatever is assigned to its name
 Set CreateFolder_Recursive = fldReturn
End Function
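
A sketch of how it could be called – picking the PST root via dialog and mirroring the Inbox tree is just one way to use it:

Sub ReplicateInboxTree()
 Dim pstRoot As Outlook.MAPIFolder
 Set pstRoot = Application.Session.PickFolder ' pick the PST's top-level folder
 ' mirror the Inbox tree (mail folders only) below the picked PST root
 CreateFolder_Recursive pstRoot, Application.Session.GetDefaultFolder(olFolderInbox), True
End Sub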

3: Get the search folder to retrieve the vaulted eMails from

Finding the respective search folder is just an iteration over all stores and figuring out the SearchFolder object with the right name.

Function FindMySearchFolder(aFolderName As String) As Outlook.MAPIFolder
 Dim colStores As Outlook.Stores
 Dim oStore As Outlook.Store
 Dim oSearchFolders As Outlook.Folders
 Dim oFolder As Outlook.MAPIFolder
 On Error Resume Next ' some stores do not expose search folders
 Set colStores = Application.Session.Stores
 For Each oStore In colStores
  Set oSearchFolders = oStore.GetSearchFolders
  For Each oFolder In oSearchFolders
   'Debug.Print (oFolder.FolderPath)
   ' match the folder whose path ends with the requested name
   If Right$(oFolder.FolderPath, Len(aFolderName)) = aFolderName Then
    Set FindMySearchFolder = oFolder
   End If
  Next
 Next
End Function

4: Finally – the eMail copy routine

That one’s the major piece of it; with every eMail retrieved from the Search Folder you’ve got to

  • Open it by the MailItem.Display command; this creates an Inspector object
  • Grab the Application.ActiveInspector and from that the Inspector.CurrentItem
  • Once the MailItem is discovered you can copy it: currentItem.Copy. That’s the major step: you could just right away move the item into the target folder in your PST, but that would not void the vault link.
  • Finally – after that copy operation – you can move the copied MailItem into the destined target folder (I made sure it is the same as in the original mail store): MailItem.Move targetFolder
  • After moving, close the item without changes: MailItem.Close olDiscard

Applying that operation to each of the vaulted eMails frees them up and makes them accessible without any vault connection.
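
Put together, the main loop could look something like this minimal sketch – the search folder name (“Vaulted”), the use of PickFolder for a flat target folder and the routine names from above are my assumptions, not the verbatim original:

Sub UnvaultMails()
 Dim srcFolder As Outlook.MAPIFolder
 Dim targetFolder As Outlook.MAPIFolder
 Dim itm As Object
 Dim openedItem As Object
 Dim copiedItem As Object
 Set srcFolder = FindMySearchFolder("Vaulted") ' the Search Folder from step 3
 Set targetFolder = Application.Session.PickFolder ' simplified: one flat target folder
 For Each itm In srcFolder.Items
  itm.Display ' opens an Inspector on the vaulted eMail
  Set openedItem = Application.ActiveInspector.CurrentItem
  Set copiedItem = openedItem.Copy ' the copy loses the vault link
  copiedItem.Move targetFolder ' move the copy into the PST
  openedItem.Close olDiscard ' close the original without changes
  DoEvents ' lets Outlook recover from heavy forms cache use (see the hints below)
 Next itm
End Sub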


Now – a few useful hints

for the benefit of your patience:

  • The Outlook forms cache is a tricky beast. As Enterprise Vault uses a bunch of custom forms to handle vaulted eMails, the forms cache is heavily used during this operation. I removed it before execution and also made sure that, in case it got scrambled again, forms would be loaded from their original source instead of from the cache. Here are a few sources on the Outlook forms cache and the ForceFormReload registry key.
  • This still did not allow the macro to execute on all the 1300-something eMails I had to unvault. Ultimately, a simple DoEvents command in the macro’s main loop allowed Outlook to regularly recover from its heavy use of the forms cache.
  • Where to start? I used the namespace method PickFolder and simply chose the right folder to target my eMails to via the dialog it brings up.
  • Deletion after unvault: You might wanna consider deleting any vaulted eMail from your main mail store once it’s been copied to the PST.

So, the end result finally resides within my Outlook Application as a VBA routine and lets me regularly unvault and PST-archive my eMail.

Nice .. I think.



Security is none of my business!

… says the home IT expert. “I’m just not experienced enough to bother with this kind of nightmare”, he says – until malware kicks in …

I’ve gone through a little sweat and threat over the past few hours, as I wasn’t really able to securely (!) manage my NAS while absent from home – and at the same time I was reading about the SynoLocker ransomware attack.

This is neither particularly pleasant nor comforting in any way, but luckily it seems that I had at least done a few things right when configuring my gear (my data has remained untouched so far). So I thought I’d share some thoughts:

1 password – 1 service

The simplest and surely one of the silliest reasons for being attacked is using some kind of default password (like “12345” or the username) or using one password for many services. This obviously wasn’t the reason SynoLocker was able to attack, but it still gave comfort earlier today, when it was unclear whether the attack was run via a password vulnerability.

I have to admit I’ve gained some confidence from using an easy-to-remember algorithm which leads to a different password for whatever new service I consume.

No standard ports

For some reason I have this weird habit of changing ports when opening up a new service from the private LAN to the Internet. I never really go with what is suggested, so when configuring my NAS for the first time, I chose to deviate from the default ports 5000 and 5001 for the control panel. I cannot be sure, but possibly this action saved me from SynoLocker’s attack.

So I’d better keep the habit, and whenever I have the possibility, I change any port exposed to that dangerous world outside.

Yes, I appreciate that for certain services one cannot do that, as the service consumers – like a client application – rely on fixed predefined ports. Let’s hope, then, that the data accessible through such a service is just not that important and that you’ve got a backup of it.

Firewall rules are OK

Yes. They are. Especially the ones allowing you to manage your gear within the local private network. I got a rule to allow access from the private subnet

Ports: All - Protocol: All - Source IP: <my subnet IP, 16 bit + subnet mask> - Action: Allow

while at the same time everything not matching an existing rule is allowed. Now, when I need to close down everything, I just enter the panel, choose to “deny all that doesn’t match a rule” and – BOOM – doors shut. That leaves me without management control from the outside. But – hey! – that’s minor compared to encrypted data held at ransom. And sooner rather than later I’ll make my way into the private LAN anyway to fix things.

Let the service dictate

My NAS has quite a nice feature – which probably every NAS has anyway: when everything is closed down for access from the outside, it tells you what to open for a particular service to work. Hence, I always close the firewall completely when installing or starting a new service. With that – through the respective warning popup – I instantly learn what has to be opened for this service to operate. There’s just a few things to keep in mind:

  • No, you do NOT want to choose the “suppress firewall messages in future” option
  • No, you do NOT want to click OK without investigating the list
  • Yes, you DO want to spend time to figure out how to – here we are again! – change the default ports without disrupting the respective service

And so – to conclude: Here’s what put me under sweat and threat, eventually:

Update!

I didn’t. It turned out that I was running my gear with exactly the version vulnerable to SynoLocker, without ever having thought of checking for updates – and without having checked the option to do critical updates automatically in the background.

“Why?”, you ask. Honestly, because I tend to think that malfunctioning updates can break stuff as well. The only question is: what is worse – a malfunctioning update or a malware injection?

Well – the option is checked now, the measures above saved me from a nightmare of data loss or at least financial implications, and my gear remains closed to the public until further notice. From the vendor.

