Backstory:
I recently switched web hosts from Digital Ocean to OVH. I decided to go the Docker route and hopefully make migration less of a pain in the future. I run 4 websites off of the host: 3 are WordPress and 1 is a Koken site for photography. I don’t think it was actually less of a pain to do it this way, but once I started, I had to finish it.

Disclaimer:
I’m certain that this is NOT the optimal setup, but I could not find anyone who had done this, so hopefully it’s a jumping-off point for someone with more patience and time than me. Running 3 separate WordPress Apache Docker containers simultaneously is a waste of resources.

Intro:
I chose to run a separate Docker container for each website and use the popular nginx reverse-proxy image to serve them all out at different VIRTUAL_HOSTs. They are not set up for HTTPS yet.

I am going to use the following containers:
2x containers of the wordpress image, with a small addition (the Apache headers module) baked in via a Dockerfile.
1x container of koken/koken-lemp.
1x container of mariadb (or you could use mysql). This is where all the WordPress containers go to CRUD their databases.
1x container for the nginx reverse proxy.

Let’s get to business now! The first step is to get the custom WordPress image configured so that it has the custom modules. You can skip this step if you don’t use any extra modules.


##dockerfile start
#taken from https://www.linux.com/learn/how-build-your-own-custom-docker-images
FROM wordpress
MAINTAINER DockerFan version 1.0
ENV DEBIAN_FRONTEND noninteractive
ENV APACHE_RUN_USER www-data
ENV APACHE_RUN_GROUP www-data
ENV APACHE_LOG_DIR /var/log/apache2
ENV APACHE_LOCK_DIR /var/lock/apache2
ENV APACHE_PID_FILE /var/run/apache2.pid
#enable headers
RUN a2enmod headers
##dockerfile end

Save this as Dockerfile in a directory, and run
docker build -t wpapache .

Now you have an image of an up-to-date WordPress with your customization baked into it.

Next step: setup a docker-compose.yml file so you can get everything linked and setup with just one command.


##docker-compose.yml start
##run with: docker-compose up -d
wpblog1:
  image: wpapache
  links:
    - mysql
  environment:
    WORDPRESS_DB_NAME: mysqldb1
    WORDPRESS_DB_HOST: mysql:3306
    WORDPRESS_DB_USER: databaseuser
    WORDPRESS_DB_PASSWORD: db-password
    VIRTUAL_HOST: 'www.wordpressblog1.com,wordpressblog1.com'
  ## note that you can have a continuing list of VIRTUAL_HOSTs if you have multiple domains or subdomains that should route to this container
  ports:
    - "127.0.0.1:8081:80"
    #- "127.0.0.1:8082:443"
    ## You can expose more ports here, for instance for HTTPS, but I commented it out for now
  working_dir: /var/www/html
  volumes:
    - /web/myblog1:/var/www/html/
wpblog2:
  image: wpapache
  links:
    - mysql
  environment:
    WORDPRESS_DB_NAME: mysqldb2
    WORDPRESS_DB_HOST: mysql:3306
    WORDPRESS_DB_USER: databaseuser
    WORDPRESS_DB_PASSWORD: db-password
    VIRTUAL_HOST: 'www.wordpressblog2.com,wordpressblog2.com'
  ports:
    - "127.0.0.1:8082:80"
  ## notice the localhost port is different so that it does not conflict with wpblog1
  working_dir: /var/www/html
  volumes:
    - /web/myblog2:/var/www/html/
kokensite:
  #koken has its own mysql container built in, so there's no need to link to the mysql/mariadb container like the wordpress ones
  image: koken/koken-lemp
  environment:
    VIRTUAL_HOST: 'www.kokensitehere.com,kokensitehere.com,photography.kokensitehere.com'
  ports:
    - "127.0.0.1:8084:8080"
  working_dir: /usr/share/nginx/www
  restart: always
  volumes:
    - /web/kokensitehere:/usr/share/nginx/www
    - /web/kokensiteheremysql:/var/lib/mysql
  ## note the mysql volume is stored on the local machine, so it should be easier to manage
mysql:
  image: mariadb:10.1
  ports:
    - "127.0.0.1:3306:3306"
  #environment:
  #  - MYSQL_ROOT_PASSWORD=mysqlpasswordhere
  #  - MYSQL_DATABASE=mysqldatabase
  ## I commented the above env variables out because they are used to create a new database, and we are just migrating an existing one. Still good to know though.
  volumes:
    - /web/mysql:/var/lib/mysql
#phpmyadmin:
## in case you need to also get into mysql and import your SQL files with a GUI, uncomment this whole section.
#  image: corbinu/docker-phpmyadmin
#  links:
#    - mysql
#  ports:
#    - "8080:80"
#  restart: always
#  environment:
#    MYSQL_USERNAME: root
#    MYSQL_ROOT_PASSWORD: mysqlpasswordhere
nginx:
  image: jwilder/nginx-proxy
  ports:
    - "80:80"
    - "443:443"
  volumes:
    - /var/run/docker.sock:/tmp/docker.sock:ro
##docker-compose.yml end

Now that you have docker-compose.yml configured with your WordPress & Koken MySQL usernames/passwords and pointed at the right locations, you just have to run
docker-compose up -d

If all goes well they should be running now on the new server.
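Since both WordPress services share the same database host, user, and password, one optional cleanup is to pull the shared values out of each service and into a single env file. This is just a sketch under the assumption that you stay on the v1 compose format used above; the filename is made up:

```yaml
# Hypothetical cleanup: both blogs use identical DB credentials, so they
# could live in one env file instead of being repeated per service.
#
# wordpress-common.env would contain:
#   WORDPRESS_DB_HOST=mysql:3306
#   WORDPRESS_DB_USER=databaseuser
#   WORDPRESS_DB_PASSWORD=db-password
wpblog1:
  image: wpapache
  links:
    - mysql
  env_file: wordpress-common.env
  environment:
    # only the per-site settings stay inline
    WORDPRESS_DB_NAME: mysqldb1
    VIRTUAL_HOST: 'www.wordpressblog1.com,wordpressblog1.com'
```

That way a password change is a one-line edit instead of one per blog.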
If you’re like me and have to edit the docker-compose file and re-run it a lot, a small bash script can help: it deletes ALL of your Docker containers (warning!) and recreates them from the docker-compose.yml file:

#!/bin/bash
##recreateContainers.sh - delete all docker containers, and run the docker-compose command.
ALL=$(docker ps -a -q)
docker stop $ALL
docker rm $ALL
docker-compose up -d

If you are having issues with the MySQL databases not connecting, you will have to do a mysqldump of the databases from the old MySQL instance and import them into the new one.

mysqldump -urootuserhere -p wordpressblog1 --single-transaction --routines --triggers > /web/mysql/backupwp1.sql

docker exec -i -t mysql_container bash
#now you are in the mysql container (its actual name may differ; check with docker ps)
mysql -urootuserhere -p wordpressblog1 < /var/lib/mysql/backupwp1.sql

Go directly to Customizations page in new window:
javascript:window.open($('#crmContentPanel iframe:not([style*=\"visibility: hidden\"])')[0].contentWindow.Xrm.Page.context.getClientUrl() + "/tools/solution/edit.aspx?id=%7bfd140aaf-4df4-11dd-bd17-0019b9312238%7d#");

Go to Solutions in new window:
javascript:window.open($('#crmContentPanel iframe:not([style*=\"visibility: hidden\"])')[0].contentWindow.Xrm.Page.context.getClientUrl() + "/main.aspx?Origin=Portal&page=Settings&area=nav_solution");

Go to Security in new window:
javascript:debugger;window.open($('#crmContentPanel iframe:not([style*=\"visibility: hidden\"])')[0].contentWindow.Xrm.Page.context.getClientUrl() + "/tools/AdminSecurity/adminsecurity_area.aspx");

Get a Solution’s Dependencies in new window:
javascript: try { var cont = frames[1] || frames[0]; window.open(cont.Xrm.Page.context.getClientUrl() + "/tools/dependency/dependencyviewdialog.aspx?objectid=" + cont.APP_SOLUTION_ID.substring(1,37) + "&objecttype=7100&operationtype=dependenciesforuninstall"); } catch (e) { if (console && console.log) { console.log("Error occurred: " + e); }}

Get a Record’s ID:
javascript:var thisXrm; for (var i = 0; i < 5; i++) { try { var wf = window.frames; if (wf[i] && wf[i].Xrm && wf[i].Xrm.Page && wf[i].Xrm.Page.data != null) { thisXrm = wf[i].Xrm; var id = thisXrm.Page.data.entity.getId(); if (id) { clipboardData.setData("Text", id.toString()); alert(id); } else { alert('No id yet!'); } } } catch (e) { }}

Get a form’s type code:
javascript: var typeCode = frames[0].Xrm.Page.context.getQueryStringParameters().etc; if (typeCode) { clipboardData.setData("Text", typeCode.toString()); alert(typeCode); }
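The .etc value is the entity type code, which CRM carries in the form frame’s query string; getQueryStringParameters just parses that string into a map. A rough, standalone sketch of that parsing step (simplified, and not CRM’s actual implementation):

```javascript
// Rough sketch of what Xrm.Page.context.getQueryStringParameters does:
// split the frame's query string into a key/value map.
function parseQueryString(search) {
    var params = {};
    search.replace(/^\?/, "").split("&").forEach(function (pair) {
        if (!pair) return;
        var kv = pair.split("=");
        params[decodeURIComponent(kv[0])] = decodeURIComponent(kv[1] || "");
    });
    return params;
}

console.log(parseQueryString("?etc=1&id=%7b123%7d").etc); // "1"
```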

Setting a value on a rollup field is impossible in JavaScript in CRM 2015, and you can’t refresh the field individually without clicking the Refresh button, which can be a pain for users if you have several rollup fields to recalculate.

function setValueOnRollup(field, value) {
    //unsupported way of putting a value in the calculated field
    //DOES NOT update the value in CRM, only on the display
    $('#' + field).find('.rollup').find('span')[0].textContent = value;
}

So if you wanted to use it on a form, call setValueOnRollup('yourfieldhere', '100.00').

It’s best used when you have the values you want to show (preferably the real values), which can be retrieved with a query built using CRM REST Builder (https://crmrestbuilder.codeplex.com/).

These also might come in handy to manage the rollups and processes:

Process.js – CRM 2013/2015 Call Action, Workflow, or Dialog from JavaScript (https://processjs.codeplex.com/) is a cool tool to run a process from your custom JavaScript.

Dynamics CRM 2016 Workflow Tools (https://msdyncrmworkflowtools.codeplex.com/), specifically the “Force Calculate Rollup Field” feature to update the value in CRM.

If you’ve ever had this error pop up when deleting a managed solution on your CRM 2015 box:

The Main() component cannot be deleted because it is referenced by 1 other components. For a list of referenced components, use the RetrieveDependenciesForDeleteRequest.

You should use this to quickly get the dependency information:

First, go to the solution’s URL (it will look like https://CONTOSO.CO/COCO/tools/solution/edit.aspx?id=%7b2BE0D3AD-DF66-4747-8AA0-A5BA16B146D3%7d).

Then run this bookmarklet, which I call GetSolutionDependencies:

javascript: try { var cont = frames[1] || frames[0]; window.open(cont.Xrm.Page.context.getClientUrl() + "/tools/dependency/dependencyviewdialog.aspx?objectid=" + cont.APP_SOLUTION_ID.substring(1,37) + "&objecttype=7100&operationtype=dependenciesforuninstall"); } catch (e) { if (console && console.log) { console.log("Error occurred: " + e); }}

It will extract the id for that solution, then open a page in your CRM org showing more details. I’ve only tested it in IE11, so the context may differ of course.
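The id extraction itself is just string slicing: CRM hands back the solution id wrapped in braces, and substring(1, 37) peels the braces off the 38-character {GUID} string. A standalone sketch of that step (the helper name is mine):

```javascript
// CRM solution ids arrive brace-wrapped, e.g. "{FD140AAF-4DF4-11DD-BD17-0019B9312238}".
// A GUID is 36 characters, so substring(1, 37) drops the surrounding "{" and "}".
function extractSolutionId(wrapped) {
    return wrapped.substring(1, 37);
}

var id = extractSolutionId("{2BE0D3AD-DF66-4747-8AA0-A5BA16B146D3}");
console.log(id); // 2BE0D3AD-DF66-4747-8AA0-A5BA16B146D3
```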

I recently had a hard drive failure while transitioning to an aufs/snapraid configuration using OMV on my Proxmox box.  I ended up losing about 3TB of data, of which 800GB was not backed up on another device in my network. Ouch.  Fortunately I had Crashplan Pro running for the past few years and backing up everything.

I first started using the Crashplan Restore option on my Windows 8 box but was constrained to running it at between 6-10 Mbps, which was estimated to take over a week to restore all of the data. I spoke with Crashplan support and they said this was normal, as I was sharing the line and resources with other users. Fine, I thought, but I don’t want to keep my primary Windows computer on for days on end when I could just set up one of my VMs on my headless server to do the work. I had left it running for a few days already but had only received about 200GB of data.

Surprisingly, when I set up my OMV box to use Crashplan’s GUI by installing LXDE and the Crashplan application, my download speeds skyrocketed to saturate my entire 50 Mbps line! I saw over 500GB downloaded in a single day through my OMV box. I thought I would share this for those who might be restoring a huge quantity of data but are seeing slow speeds on their Windows box. I wasn’t able to resume the existing progress from my Windows box (so I ended up overwriting the 200GB that Crashplan had already downloaded in Windows), but the speed more than made up for it.

I dedicated 3 CPUs and 8GB of RAM to my OMV box for the restore, and it eats up about 30-50% of the CPU while it runs. Not too shabby!

More information on restoring from Crashplan Pro (a service which I highly recommend) is here: https://support.code42.com/CrashPlan/Latest/Restoring/Restoring_Files_From_The_CrashPlan_App

The OpenMediaVault (OMV) main page is here: http://www.openmediavault.org/

I’ve been tinkering with Proxmox VE for holding my Linux NAS and media file server (OpenMediaVault [OMV]) in addition to some other Linux containers.  As OMV would be the only VM to access some of these disks I scoured the web for how to add a physical disk to a VM (http://forum.proxmox.com/archive/index.php/t-6192.html).

The problem with using /dev/sdX to pass through is that the drives often change their X value (sometimes my primary is on /dev/sda and other times /dev/sdg). I couldn’t find an easy way in the Proxmox forums or documentation to point at the disk itself, rather than its changing symbolic link, that didn’t involve messing with udev & udevadm, but I did discover that you can point QM at the disk’s stable /dev/disk/by-id path directly in the vm.conf file (such as 101.conf, 102.conf, etc.).

Basically, do this in the Proxmox Console or an SSH session:

ls -l /dev/disk/by-id
# Look for the hard drive itself (not the partitions, which will have -part1 and so forth appended).
# In my example I found scsi-SATA_ST5000VN000-1H4_Z111111
# Then go into your vm.conf file (i.e. nano /etc/pve/qemu-server/101.conf) and manually add it to the device type and number that you want to pass through.
# Here is what you would add to have it on the 6th SCSI device (scsi5):
scsi5: /dev/disk/by-id/scsi-SATA_ST5000VN000-1H4_Z111111
# Save the file and you should now see the disk when you look at the VM in Proxmox.

Such an easy solution but it took me a few hours of messing with udev to figure out.
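What makes by-id safe is that those entries are just symlinks that udev re-points at whatever /dev/sdX node the disk received on this boot, keyed by the drive’s model and serial. You can see the mechanism with a throwaway symlink (all paths in this demo are made up):

```shell
# Simulate a /dev/disk/by-id entry: a stable name that is just a
# symlink to the current (unstable) device node.
mkdir -p /tmp/demo-by-id
touch /tmp/demo-by-id/sda
ln -sf /tmp/demo-by-id/sda /tmp/demo-by-id/scsi-SATA_EXAMPLE_SERIAL123

# readlink -f shows which node the stable name currently points at
readlink -f /tmp/demo-by-id/scsi-SATA_EXAMPLE_SERIAL123
```

If the disk came up as sdg on the next boot, udev would recreate the same by-id name pointing at the new node, so the vm.conf entry never has to change.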

Here’s a little snippet I authored this morning, before my cup of coffee, to query the CRM SQL database (YourOrg_MSCRM) for a list of web resources in the order of last ModifiedOn:

SELECT ModifiedOn, Name, DisplayName, OrganizationIdName, WebResourceType
FROM WebResource --table for web resources
WHERE IsCustomizable = 1 -- just for custom web resources
ORDER BY ModifiedOn DESC

Anders Fjeldstad posted a neat JS shim on GitHub that adds the ability to make CRM 2013-style notifications in CRM 2011.

You can wrap it in a function that you call during your form’s CRM onLoad event.  That’s it!  Then you can make notifications appear easily:

Xrm.Page.ui.setFormNotification('This is an error',1,101); //Error (signified by the 1, with messageId 101)
Xrm.Page.ui.setFormNotification('This is a warning',2,102); //Warning (signified by the 2, with messageId 102)
Xrm.Page.ui.setFormNotification('This is informational',3,103); //Informational message (signified by the 3, with messageId 103)

Removing a notification is the same as in CRM 2013:

Xrm.Page.ui.clearFormNotification(101); //this clears messageId 101
Xrm.Page.ui.clearFormNotification(102); //this clears messageId 102
Xrm.Page.ui.clearFormNotification(103); //this clears messageId 103

Get it here on GitHub.
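Under the hood, a shim like this only needs to keep a map of messageId → notification so that clearFormNotification can remove the right one. A minimal sketch of that bookkeeping (my illustration, not Fjeldstad’s actual code; the real shim also renders the notification bar in the form’s DOM):

```javascript
// Tracks notifications by messageId, mirroring the CRM 2013 API shape.
// Levels follow the post above: 1 = Error, 2 = Warning, 3 = Informational.
var notifications = {};

function setFormNotification(message, level, messageId) {
    notifications[messageId] = { message: message, level: level };
    // (a real shim would also draw/update the notification area here)
}

function clearFormNotification(messageId) {
    var existed = notifications.hasOwnProperty(messageId);
    delete notifications[messageId];
    return existed; // true if a notification was actually removed
}
```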

I found quite a gem today while browsing the Microsoft Developer blogs for help with debugging JavaScript in CRM 2011. It can be a pain to update JavaScript in Dynamics CRM 2011 without shelling out extra money for Visual Studio plug-ins to simplify it.  If you’re new to JavaScript in CRM 2011, this video from Marc Schweigert should help get you started with setting up Visual Studio to do the menial work of publishing web resources and debugging them.

Found at EUREKA: F5 debugging of CRM JavaScript web resources


It’s a challenge to work with thousands of custom fields in JavaScript in CRM 2011. A tool is available for Visual Studio 2010 and up that automatically imports all of the fields into an object, dubbed “Properties” by default. This lets us use IntelliSense to ensure that we have the correct names for the fields we reference in JavaScript. Unfortunately the tool, which is fantastic, is not free past its 30-day trial.

Enter custom JS Code. I personally like to use a custom form script library that contains developer-specific functionality (which can be referenced by GUID) and is used with custom key declarations (a previous post explains this).

Here’s the function that I call which will output objects that can almost be copy-and-pasted directly into Visual Studio. I’m sure there’s a better way, but this works for me given the minimal effort that went into it.

function util_listControlstoConsole() {
    try {
        //Controls
        if (confirm('Do you wish to list all controls to the console?')) {
            custAlert('////The below are a listing of all detected controls on the form////');
            console.log('var Properties = {');
            Xrm.Page.ui.controls.forEach(
                function (control) {
                    if (control.getControlType() !== "subgrid" &&
                        typeof control.getName === "function") {
                        console.log(control.getName() + ": '" + control.getName() + "', //Label: " + control.getLabel() + " Parent Name: '"
                            + control.getParent().getName() + "' Parent Label: '" + control.getParent().getLabel() + "'");
                    }
                }
            );
            console.log('}; //remove last comma (,) from above when putting in script, note that there may be duplicates (appended with 1)');
        }
        if (confirm('Do you wish to list all sections/tabs to the console?')) {
            //Sections
            custAlert('////The below are a listing of all detected sections/tabs on the form////');
            console.log('var Displays = {');
            console.log(' sections: {');
            var tabs = Xrm.Page.ui.tabs;
            for (var i = 0; i < tabs.getLength(); i++) {
                var tab = tabs.get(i);
                var sections = tab.sections;
                for (var j = 0; j < sections.getLength(); j++) {
                    var section = sections.get(j);
                    var name = section.getName();
                    if (name !== null) {
                        console.log(" " + name.replace(/\W/g, '') + ": '" + name + "', //Label: " + section.getLabel() + " Parent Tab: '"
                            + section.getParent().getName() + "'");
                    }
                    //console.log(section.getParent().getName() + "." + section.getName() + ": '" + section.getName() + "'");
                }
            }
            console.log(' }, //remove last comma (,) from above when putting in script');
            //Tabs
            console.log(' tabs: {');
            for (var i = 0; i < tabs.getLength(); i++) {
                var tab = tabs.get(i);
                var tabName = tab.getName();
                if (tabName !== null) {
                    console.log(" " + tabName.replace(/\W/g, '') + ": '" + tabName + "', //Label: " + tab.getLabel());
                }
            }
            console.log(' }//remove last , from above');
            console.log('}; ');
        }
    } catch (e) {
        //alert
    }
}

The output will be something like this:

var Properties = {
    firstname: 'firstname', //Label: First Name Parent Name: 'name' Parent Label: 'Name'
    middlename: 'middlename', //Label: Middle Name Parent Name: 'name' Parent Label: 'Name'
    lastname: 'lastname', //Label: Last Name Parent Name: 'name' Parent Label: 'Name'
    suffix: 'suffix', //Label: Suffix Parent Name: 'name' Parent Label: 'Name'
}; //remove last , from above
var Displays = {
    sections: {
        name: 'name', //Label: Name Parent Tab: 'general'
    }, //remove last , from above
    tabs: {
        general: 'general', //Label: General
    }//remove last , from above
};

Pros:
Now when you reference a control, you can write Properties.firstname and have the entry checked by IntelliSense rather than typing the name by hand. Likewise, if you want to reference a section or tab, you can use Displays.sections.name or Displays.tabs.general. It may be a tad wordier than just writing 'general', but it makes the code easier to maintain and error-check.

Cons:
I’d recommend you tweak the script to get it to work the way you want. Right now it spits out an extra comma after the last control/section/tab, so you have to remove it manually when pasting into Visual Studio.
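One way to fix the trailing-comma annoyance is to collect the entries in an array and join them, instead of logging line by line. A sketch of that approach (pure string-building, with fake field names standing in for what Xrm.Page.ui.controls would yield):

```javascript
// Build the Properties object source in memory, then emit it once.
// Array.join(',\n') puts commas only BETWEEN entries, so there is no
// trailing comma to delete by hand in Visual Studio.
function buildPropertiesSource(names) {
    var entries = names.map(function (name) {
        return "    " + name + ": '" + name + "'";
    });
    return "var Properties = {\n" + entries.join(",\n") + "\n};";
}

console.log(buildPropertiesSource(["firstname", "lastname"]));
// var Properties = {
//     firstname: 'firstname',
//     lastname: 'lastname'
// };
```

The same join trick works for the sections and tabs blocks.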