Channel: Freddys Blog

Connecting to NAV Web Services from the Cloud–part 2 out of 5


If you haven’t already read part 1 you should do so here, before continuing to read this post.

In part 1 I showed how a service reference plus two lines of code:

var client = new Proxy1.ProxyClassClient("NetTcpRelayBinding_IProxyClass");
Console.WriteLine(client.GetCustomerName("10000"));

could extract data from my locally installed NAV from anywhere in the world.

Let’s start by explaining what this does.

The first line instantiates a WCF client class with a parameter pointing to a config section that describes the bindings and other communication settings.

A closer look at the config file reveals 3 endpoints defined by the service:

<client>
  <endpoint
    address="sb://navdemo.servicebus.windows.net/Proxy1/"
    binding="netTcpRelayBinding"
    bindingConfiguration="NetTcpRelayBinding_IProxyClass"
    contract="Proxy1.IProxyClass"
    name="NetTcpRelayBinding_IProxyClass" />
  <endpoint
    address="https://navdemo.servicebus.windows.net/https/Proxy1/"
    binding="basicHttpBinding"
    bindingConfiguration="BasicHttpRelayBinding_IProxyClass"
    contract="Proxy1.IProxyClass"
    name="BasicHttpRelayBinding_IProxyClass" />
  <endpoint
    address="http://freddyk-appfabr:7050/Proxy1"
    binding="basicHttpBinding"
    bindingConfiguration="BasicHttpBinding_IProxyClass"
    contract="Proxy1.IProxyClass"
    name="BasicHttpBinding_IProxyClass" />
</client>

The endpoint we use in the sample above is the first one, using the binding called netTcpRelayBinding and the binding configuration NetTcpRelayBinding_IProxyClass, defined in the config file like:

<netTcpRelayBinding> 
  <binding 
    name="NetTcpRelayBinding_IProxyClass" 
    closeTimeout="00:01:00" 
    openTimeout="00:01:00" 
    receiveTimeout="00:10:00" 
    sendTimeout="00:01:00" 
    transferMode="Buffered" 
    connectionMode="Relayed" 
    listenBacklog="10" 
    maxBufferPoolSize="524288" 
    maxBufferSize="65536" 
    maxConnections="10" 
    maxReceivedMessageSize="65536"> 
    <readerQuotas
      maxDepth="32"
      maxStringContentLength="8192"
      maxArrayLength="16384"
      maxBytesPerRead="4096"
      maxNameTableCharCount="16384" />
    <reliableSession  
      ordered="true" 
      inactivityTimeout="00:10:00" 
      enabled="false" /> 
    <security 
      mode="Transport"  
      relayClientAuthenticationType="None"> 
      <transport protectionLevel="EncryptAndSign" /> 
      <message clientCredentialType="Windows" /> 
    </security> 
  </binding>
</netTcpRelayBinding>

If we want to avoid using the config file for creating the client, we can also create the bindings manually. The code would look like:

var binding = new NetTcpRelayBinding(EndToEndSecurityMode.Transport, RelayClientAuthenticationType.None);
var endpoint = new EndpointAddress("sb://navdemo.servicebus.windows.net/Proxy1/");
var client = new Proxy1.ProxyClassClient(binding, endpoint);
Console.WriteLine(client.GetCustomerName("10000"));

In order to do this, you need to have the Windows Azure AppFabric SDK installed (it can be found here), AND you need to set the Target Framework of the project to .NET Framework 4 (NOT the .NET Framework 4 Client Profile, which is the default). After doing this, you can add a reference and a using statement for Microsoft.ServiceBus.

You can also use .NET Framework 3.5 if you like.

What are the two other endpoints?

As you might have noticed, the config file listed 3 endpoints.

The first endpoint uses the servicebus protocol (sb://), and connecting to it requires the NetTcpRelayBinding, which in turn requires Microsoft.ServiceBus.dll to be present.

The second endpoint uses the https protocol (https://), and a consumer can connect to it using either the BasicHttpRelayBinding (from Microsoft.ServiceBus.dll) or the standard BasicHttpBinding (which is part of System.ServiceModel.dll).

The last endpoint uses the http protocol (http://) and is a local endpoint on the machine hosting the service (used primarily for development purposes). If this endpoint were to be reachable from outside the Microsoft corporate network, I would have to ask Corporate IT to set up firewall rules and open a specific port for my machine – basically all of the things that the Servicebus saves me from doing.

BasicHttpRelayBinding

Like NetTcpRelayBinding, this binding is also defined in Microsoft.ServiceBus.dll, and code using this binding to access our Proxy1 would look like this:

var binding = new BasicHttpRelayBinding(EndToEndBasicHttpSecurityMode.Transport, RelayClientAuthenticationType.None);
var endpoint = new EndpointAddress("https://navdemo.servicebus.windows.net/https/Proxy1/");
var client = new Proxy1.ProxyClassClient(binding, endpoint);
Console.WriteLine(client.GetCustomerName("10000"));

Again – this sample would require the Windows Azure AppFabric SDK to be installed on the machine connecting.

How to connect without the Windows Azure AppFabric SDK

As I stated in the first post, it is possible to connect to my Azure-hosted NAV Web Services using a standard binding, without requiring the Windows Azure AppFabric SDK to be installed on the machine running the application.

You still need the SDK on the developer machine if you need to create a reference to the metadata endpoint in the cloud, but then again that could be done on another computer if necessary.

The “secret” is to connect using standard BasicHttpBinding – like this:

var binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
var endpoint = new EndpointAddress("https://navdemo.servicebus.windows.net/https/Proxy1/");
var client = new Proxy1.ProxyClassClient(binding, endpoint);
Console.WriteLine(client.GetCustomerName("10000"));

Or, using the config file:

var client = new Proxy1.ProxyClassClient("BasicHttpRelayBinding_IProxyClass");
Console.WriteLine(client.GetCustomerName("10000"));

Note that the name refers to BasicHttpRelayBinding, but the section refers to BasicHttpBinding:

<bindings>
   <basicHttpBinding>
     <binding name="BasicHttpRelayBinding_IProxyClass" …

and the section determines which class gets instantiated – in this case a standard BasicHttpBinding, which is available in System.ServiceModel of the .NET Framework.

Why would you ever use any of the Relay Bindings then?

One reason to use the Relay bindings is that they support a first level of authentication (RelayClientAuthenticationType). As a publisher of services in the cloud, you can secure them with an authentication token, which consumers must provide before they are ever allowed through to the proxy hosting your service.
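As a hedged sketch of what that would mean for a consumer: if the sb:// endpoint were published with RelayClientAuthenticationType.RelayAccessToken instead of None, the client would have to present Servicebus credentials before the relay lets the call through – mirroring the TransportClientEndpointBehavior setup used on the service side later in this series. The issuer name and secret below are placeholders for the values from your AppFabric account.

```csharp
// Sketch only: connecting to an endpoint published with
// RelayClientAuthenticationType.RelayAccessToken (old Microsoft.ServiceBus SDK).
var binding = new NetTcpRelayBinding(
    EndToEndSecurityMode.Transport,
    RelayClientAuthenticationType.RelayAccessToken);
var endpoint = new EndpointAddress("sb://navdemo.servicebus.windows.net/Proxy1/");
var client = new Proxy1.ProxyClassClient(binding, endpoint);

// The client must present shared-secret credentials so the relay will
// let it through to the proxy. "name"/"secret" are placeholders.
var credential = new TransportClientEndpointBehavior();
credential.CredentialType = TransportClientCredentialType.SharedSecret;
credential.Credentials.SharedSecret.IssuerName = "name";
credential.Credentials.SharedSecret.IssuerSecret = "secret";
client.ChannelFactory.Endpoint.Behaviors.Add(credential);

Console.WriteLine(client.GetCustomerName("10000"));
```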

Note though that currently a number of platforms don't support the Relay bindings out of the box (Windows Phone 7 being one), and for this reason I don't use them.

In fact, the service I have provided for getting customer names is completely unsecured: everybody can connect to it, provide a customer number, and get back a customer name. I will discuss authentication more deeply in part 4 of this post series.

Furthermore, the NetTcpRelayBinding supports a hybrid connection mode, which allows the connection between server and client to start out relayed through the cloud and then switch to a direct connection between the two parties if possible. In my case, I know that my NAV Web Services proxy is not reachable from the outside, so a direct connection is not possible – no reason to fool ourselves.
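For completeness, a hedged sketch of what requesting hybrid mode would look like in code (assuming the TcpRelayConnectionMode enum from Microsoft.ServiceBus.dll):

```csharp
// Sketch: start out relayed through the cloud, but let the Servicebus
// upgrade to a direct connection between the parties if one is possible.
var binding = new NetTcpRelayBinding(
    EndToEndSecurityMode.Transport,
    RelayClientAuthenticationType.None);
binding.ConnectionMode = TcpRelayConnectionMode.Hybrid;
// The config-file equivalent would be connectionMode="Hybrid"
// instead of the connectionMode="Relayed" shown above.
```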

Connecting to Proxy1 from Microsoft Dynamics NAV 2009 R2

In order to connect to our cloud-hosted service from NAV, we need to create a DLL containing the proxy classes (similar to what Visual Studio does when you add a service reference).

NAV 2009 R2 does not have anything called a service reference, but it does have .NET interop, which actually comes pretty close.

If you start a Visual Studio command prompt and type in the following commands:

svcutil /out:Proxy1.cs sb://navdemo.servicebus.windows.net/Proxy1/mex
c:\Windows\Microsoft.NET\Framework\v3.5\csc /t:library Proxy1.cs

If successful, you should now have a Proxy1.dll.

[Screenshot: Visual Studio command prompt running the commands]

The reason for using the .NET 3.5 compiler is to get a .NET 3.5 assembly, which is the version of .NET used by the Service Tier and the RoleTailored Client. If you compile a .NET 4.0 assembly, NAV will not be able to use it.

Copy this DLL to the Add-Ins folder of the Classic Client and the Service Tier:

  • C:\Program Files\Microsoft Dynamics NAV\60\Classic\Add-Ins
  • C:\Program Files\Microsoft Dynamics NAV\60\Service\Add-Ins

You also need to copy it to the RoleTailored Client\Add-Ins folder if you are planning to use Client side .net interop.

Create a codeunit and add the following variables:

Name          DataType Assembly             Class
proxy1client  DotNet   Proxy1               ProxyClassClient   
securityMode  DotNet   System.ServiceModel  System.ServiceModel.BasicHttpSecurityMode
binding       DotNet   System.ServiceModel  System.ServiceModel.BasicHttpBinding
endpoint      DotNet   System.ServiceModel  System.ServiceModel.EndpointAddress

all with default properties, meaning that they will run on the Service Tier. Then write the following code:

OnRun()
securityMode := 1; // 1 = BasicHttpSecurityMode.Transport
binding := binding.BasicHttpBinding(securityMode);
endpoint := endpoint.EndpointAddress('https://navdemo.servicebus.windows.net/https/Proxy1/');
proxy1client := proxy1client.ProxyClassClient(binding, endpoint);
MESSAGE(proxy1client.GetCustomerName('20000'));

Note that you of course cannot run this codeunit from the Classic Client; instead, you will have to add an action to some page that runs the codeunit.

Note: When running the code from the RoleTailored Client, you might get an error (at least I did) stating that it couldn't resolve the name navdemo.servicebus.windows.net – I solved this by running the NAV Service Tier and Web Service Listener as a domain user instead of NETWORK SERVICE.

If everything works, you should get a message box like this:

Selangorian Ltd

Once you have created the client, using it is just a matter of calling functions on the client class.

I will do a more in-depth article about .NET interop and web services at a later time.

Connecting to Proxy1 from a Windows Phone 7

Windows Phone 7 is .NET and Silverlight – meaning that it is just as easy as writing a console app – almost…

Since our service is hosted on Azure, we can write exactly the same code for creating the client object. Invoking GetCustomerName is a little different, since Silverlight only supports asynchronous web service calls: you have to set up a handler for the response and then invoke the method.

Another difference is that the Windows Azure AppFabric SDK doesn't exist for Windows Phone 7 (or rather, it doesn't at the time of writing – maybe it will at some point in the future). This of course means that we will use our https endpoint and BasicHttpBinding to connect.

Start Visual Studio 2010 and create a new application of type Windows Phone Application under the Silverlight for Windows Phone templates.

Add a Service Reference to sb://navdemo.servicebus.windows.net/Proxy1/mex (note that you must have the Windows Azure AppFabric SDK installed on the developer machine in order to resolve this URL). In your Windows Phone application, the config file will only contain the endpoints using a binding that is actually compatible with Windows Phone, meaning that all NetTcpRelayBinding endpoints will be stripped away.

Add the following code to the MainPage.xaml.cs:

// Constructor
public MainPage()
{
    InitializeComponent();
    var client = new Proxy1.ProxyClassClient("BasicHttpRelayBinding_IProxyClass");
    client.GetCustomerNameCompleted += new EventHandler<Proxy1.GetCustomerNameCompletedEventArgs>(client_GetCustomerNameCompleted);
    client.GetCustomerNameAsync("30000");
}

void client_GetCustomerNameCompleted(object sender, Proxy1.GetCustomerNameCompletedEventArgs e)
{
    this.PageTitle.Text = e.Result;
}

Running this in the Windows Phone Emulator gives the following:

[Screenshot: Windows Phone emulator showing the returned customer name]

Wow – it has never been easier to communicate with a locally installed NAV through Web Services from an application running on a Phone – anywhere in the world.

The next post will talk about how Proxy1 is created and what it takes to expose services on the Servicebus.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV


Connecting to NAV Web Services from the Cloud–part 3 out of 5


If you haven’t already read part 2 you should do so here, before continuing to read this post.

In part 2 I talked about how to connect to my locally installed NAV Web Service proxy from anywhere in the world, and towards the end I promised to explain how the proxy was built. The problem was that while writing this post I ran into a bug in the Servicebus – which was really annoying.

The bug causes my service to stop listening after a number of hours or days – so I couldn't create a reliable way to host a service on the Servicebus, and I didn't want to post information about how to do this and risk misleading a number of people into all kinds of problems.

In the beginning I thought the problem was caused by inactivity on the service, but after several tests (each of which took days) I found that the problem only occurs when hosting a metadata endpoint in the cloud. One could argue that a metadata endpoint is only for development purposes – yes, but we are also doing development here, so I have to host one.

Anyway – knowing that I have a workaround, I will now explain how Proxy1 is created.

Proxy1

The solution I will create consists of two projects. One project compiles to a DLL and contains the ServiceHost and the actual proxy. The other project is a console application, which is used as a host application for the DLL. Later on I will add two more projects – a Windows Service and an installer – much like explained in this post: http://blogs.msdn.com/b/freddyk/archive/2010/01/30/web-services-infrastructure-and-how-to-create-an-internal-proxy.aspx

The Proxy1 Windows Service or console application will be running on the local network, next to the NAV Service Tier, and the application/service should run as a user who has access to NAV. In one of the later posts, I will explain authentication, security, and ways to make this safer.

First of all – we need to define our Service Contract:

[ServiceContract]
public interface IProxyClass
{
    [OperationContract]
    string GetCustomerName(string No);
}

and secondly the implementation of this Proxy:

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single, IncludeExceptionDetailInFaults = true)]
public class ProxyClass : IProxyClass
{
    public ProxyClass()
    {
    }

    public string GetCustomerName(string No)
    {
        Debug.Write(string.Format("GetCustomerName(No = {0}) = ", No));
        CustomerCard_Service service = new CustomerCard_Service();
        service.UseDefaultCredentials = true;
        CustomerCard customer = service.Read(No);
        if (customer == null)
        {
            Debug.WriteLine("Customer not found");
            return string.Empty;
        }
        Debug.WriteLine(customer.Name);
        return customer.Name;
    }
}

As you can see, the implementation has a reference to the Customer Card in my NAV. I am using default authentication (the current user) and just calling into NAV to read a customer and return the name – almost as simple as a Hello World sample.

This is more or less exactly the same proxy as described in this post: http://blogs.msdn.com/b/freddyk/archive/2010/01/30/web-services-infrastructure-and-how-to-create-an-internal-proxy.aspx – now the question is, how do we make this proxy accessible from anywhere in the world?

Hosting a WCF Service on the Servicebus

In my prior post about creating an internal proxy, we would use the following lines to create a ServiceHost:

        host = new ServiceHost(new MyInternalProxy(), new Uri(URL));
        host.AddServiceEndpoint(typeof(IMyInternalProxy), new BasicHttpBinding(), "");
        ServiceMetadataBehavior smb = new ServiceMetadataBehavior();
        smb.HttpGetEnabled = true;
        smb.HttpGetUrl = new Uri(URL);
        host.Description.Behaviors.Add(smb);

This creates a host that listens on the URL. One might think you could replace the URL with a magic URL on the service bus and everything would work. Well – it is not THAT simple.

In my samples I have created a ServiceClass, which is then used from my console app and from my Windows Service.

public class ServiceClass
{
    ServiceHost serviceHost = null;
    string instanceId = "Proxy1";
    bool includeMex;

    public ServiceClass(bool includeMex)
    {
        this.includeMex = includeMex;
    }

    void InitializeServiceHost()
    {
        serviceHost = new ServiceHost(new ProxyClass());

        // sb:// binding
        Uri sbUri = ServiceBusEnvironment.CreateServiceUri("sb", "navdemo", instanceId);
        var sbBinding = new NetTcpRelayBinding(EndToEndSecurityMode.Transport, RelayClientAuthenticationType.None);
        serviceHost.AddServiceEndpoint(typeof(IProxyClass), sbBinding, sbUri);

        // https:// binding (for Windows Phone etc.)
        Uri httpsUri = ServiceBusEnvironment.CreateServiceUri("https", "navdemo", "https/" + instanceId);
        var httpsBinding = new BasicHttpRelayBinding(EndToEndBasicHttpSecurityMode.Transport, RelayClientAuthenticationType.None);
        serviceHost.AddServiceEndpoint(typeof(IProxyClass), httpsBinding, httpsUri);

        if (this.includeMex)
        {
            // sb:// Metadata endpoint
            Uri mexUri = new Uri(sbUri.AbsoluteUri + "mex");
            var mexBinding = new NetTcpRelayBinding(EndToEndSecurityMode.Transport, RelayClientAuthenticationType.None);
            ServiceMetadataBehavior smb = new ServiceMetadataBehavior();
            serviceHost.Description.Behaviors.Add(smb);
            serviceHost.AddServiceEndpoint(typeof(IMetadataExchange), mexBinding, mexUri);
        }

        // Setup Shared Secret Credentials for hosting endpoints on the Service Bus
        string issuerName = "name";
        string issuerSecret = "secret";
        TransportClientEndpointBehavior sharedSecretServiceBusCredential = new TransportClientEndpointBehavior();
        sharedSecretServiceBusCredential.CredentialType = TransportClientCredentialType.SharedSecret;
        sharedSecretServiceBusCredential.Credentials.SharedSecret.IssuerName = issuerName;
        sharedSecretServiceBusCredential.Credentials.SharedSecret.IssuerSecret = issuerSecret;

        // Set credentials on all endpoints on the Service Bus
        foreach (ServiceEndpoint endpoint in serviceHost.Description.Endpoints)
        {
            endpoint.Behaviors.Add(sharedSecretServiceBusCredential);
        }

        Debug.WriteLine(string.Format("{0} Initialized", this.GetType().FullName));
    }

    public void StartHosts()
    {
        if (this.serviceHost == null)
        {
            Debug.WriteLine(string.Format("{0} Initializing...", this.GetType().FullName));
            InitializeServiceHost();
        }
        if (this.serviceHost != null && serviceHost.State != CommunicationState.Opened && serviceHost.State != CommunicationState.Opening)
        {
            Debug.WriteLine(string.Format("{0} Opening...", this.GetType().FullName));
            this.serviceHost.Open();
            Debug.WriteLine(string.Format("{0} Opened", this.GetType().FullName));
        }
    }

    public void StopHosts()
    {
        if (this.serviceHost != null && serviceHost.State != CommunicationState.Closed && serviceHost.State != CommunicationState.Closing)
        {
            Debug.WriteLine(string.Format("{0} Closing...", this.GetType().FullName));
            this.serviceHost.Close();
            Debug.WriteLine(string.Format("{0} Closed", this.GetType().FullName));
        }
        this.serviceHost = null;
    }
}

And yes – it is slightly more complicated, but if you look twice it isn't that different. For each endpoint you host, you need the type of the service contract, a binding (describing the communication protocol – BasicHttpBinding in the “old” sample), and a URL where the service is hosted.

        // sb:// binding
        Uri sbUri = ServiceBusEnvironment.CreateServiceUri("sb", "navdemo", instanceId);
        var sbBinding = new NetTcpRelayBinding(EndToEndSecurityMode.Transport, RelayClientAuthenticationType.None);
        serviceHost.AddServiceEndpoint(typeof(IProxyClass), sbBinding, sbUri);

The bindings for the Servicebus are relay bindings, and you will find them when referencing Microsoft.ServiceBus.dll.

The NetTcpRelayBinding constructor takes two parameters: a security mode (set to Transport here, meaning we get a secure line) and a RelayClientAuthenticationType, which is set to None – everybody can access this endpoint. Setting the RelayClientAuthenticationType to RelayAccessToken instead means that an application connecting to this endpoint has to present a relay access token, which again means that your proxy is never called unless callers get past this first level of authentication on the endpoint.

My other binding is https – and again the RelayClientAuthenticationType is set to None. The primary reason for the https endpoint is Windows Phone: Windows Phone doesn't know about the sb:// protocol, and it doesn't know anything about relay access tokens either – so this needs to be None.

After setting up the endpoints, we add shared-secret Servicebus credential behaviors to all endpoints. The reason is that the Servicebus requires authentication in order to host an endpoint on it – people cannot just go and host endpoints in my Servicebus namespace and leave the bill with me.

The value of these variables:

        string issuerName = "name";
        string issuerSecret = "secret";

comes from the Windows Azure AppFabric account when you sign up.

Furthermore, the name “navdemo” (used when creating the ServiceUri) is registered by me. In order to do this yourself, you will have to sign up for a Windows Azure account and register your own service namespace.

The Console App

is very simple – just an app hosting the ServiceClass. The only complication is that I want to restart the service host every hour or so when hosting a metadata endpoint, in order to work around the Servicebus bug.

class Program
{
     static Proxy1.ServiceClass proxy1;
     static bool includeMex = true;

     static void Main(string[] args)
     {
         Console.WindowWidth = 120;
         Console.WindowHeight = 50;
         Debug.Listeners.Add(new TextWriterTraceListener(System.Console.Out));

         if (includeMex)
         {
             Timer timer = new Timer(3600000); // 1 hour in milliseconds
             timer.Elapsed += new ElapsedEventHandler(timer_Elapsed);
             timer.Start();
             Console.WriteLine("Timer started - listening for 1 hour");
         }

         proxy1 = new Proxy1.ServiceClass(includeMex);
         proxy1.StartHosts();
         Console.ReadLine();
         proxy1.StopHosts();
     }

     static void timer_Elapsed(object sender, ElapsedEventArgs e)
     {
         Console.WriteLine("Timer elapsed - restart listener");
         proxy1.StopHosts();
         proxy1.StartHosts();
     }
}

When running in production, includeMex should be false.

You can download the entire solution here – but of course you cannot run it before you have a Windows Azure account.

More information about Servicebus (Windows Azure AppFabric) can be found here:

Windows Azure AppFabric general information (and free trial)

AppFabric Service Bus Tutorial and information about how to create an account and a namespace

Windows Azure AppFabric on MSDN

http://blogs.msdn.com/b/windowsazureappfabric/

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Connecting to NAV Web Services from the Cloud–part 4 out of 5


If you haven’t already read part 3 you should do so here, before continuing to read this post.

By now you have seen how to create a WCF service proxy connected to NAV, with an endpoint hosted on the Servicebus (Windows Azure AppFabric). So far I haven't written anything about security, and the Proxy1 hosted on the Servicebus is available for everybody to connect to anonymously.

In the real world, I cannot imagine many scenarios where a proxy that can be accessed anonymously should be able to retrieve data from your ERP system, so we need to look into this. There are a lot of different options – but unfortunately, many of them don't work from Windows Phone 7.

ACS security on Windows Azure AppFabric

The following links point to documents describing how to secure services hosted on the Servicebus through the Access Control Service (ACS) built into Windows Azure AppFabric.

An Introduction to Microsoft .NET Services for Developers

A Developer’s Guide to the Service Bus

A Developer’s Guide to the .NET Access Control Service

Windows Azure platform

A lot of good stuff – and if you (like me) dive deep into some of these topics, you will find that with this technology you can secure services without changing the signature of your service. Basically you use claims-based security: you ask a token service for a token with which you can connect to a service. You get back a token (valid for a given timeframe), and while the token is valid you can call the service by specifying it in the call.

This functionality has two problems:

  1. It doesn't work with Windows Phone 7 (or any other mobile device) – YET! In the future it probably will, but nobody has made a Servicebus assembly for Windows Phone, PHP, Java, or iPhone yet. See also: http://social.msdn.microsoft.com/Forums/en-US/netservices/thread/12660485-86fb-4b4c-9417-d37fe183b4e1
  2. Often we need to connect to NAV with different A/D users depending on which user is connecting from the outside.

I am definitely not a security expert – but whatever security system I put in place must support the following two basic scenarios:

  • My employees, who are using Windows Phone 7 (or other applications), need to authenticate to NAV using their own domain credentials – to have the same permissions in NAV as when running the client.
  • Customers and vendors should NOT have a username and password for a domain user in my network, but I still want to control which permissions they have in NAV when connecting.

Securing the access to NAV

I have created the following poor man's authentication to secure the access to NAV and make sure that nobody gets past the proxy without a valid username and password. I will of course always host the endpoint on a secure connection (https://), and the easiest way to add security is to extend the methods in the proxy to take a username and a password – like:

[ServiceContract]
public interface IProxyClass
{
    [OperationContract]
    string GetCustomerName(string username, string password, string No);
}

This username and password could be a domain user and password, but depending on the usage this might not be the best idea.

In the case of the Windows Phone application, where a user connects to NAV through a proxy, I would definitely use the domain username and password (as Outlook on the Windows Phone does). In the case where an application at a customer/vendor site connects to your local NAV installation over the Servicebus, it doesn't seem like a good idea to create and maintain a specific domain user and password for that customer/vendor.

In my sample I will allow both. Basically, if the username is of the format domain\user, I assume it is a domain username and password and use it directly as credentials on the service connection to NAV.

If the username does NOT contain a backslash, I look the username/password combination up in a table and find the domain, username, and password that should be used for the connection. This table could be a locally hosted SQL Express database, or it could be a table in NAV accessed through another web service. I have done the latter.

In NAV, I have created a table called AuthZ:

[Screenshot: the AuthZ table definition]

This table will map usernames and passwords to domain usernames and passwords.

As a special option I have added a UseDefault field, which means that the service will use default authentication to connect to NAV – that is, the user running the Service Host console application or Windows Service. I have filled some data into this table:

[Screenshot: sample data in the AuthZ table]

and created a codeunit called AuthZ (which I have exposed as a web service as well):

GetDomainUserPassword(VAR Username : Text[80];VAR Password : Text[80];VAR Domain : Text[80];VAR UseDefault : Boolean) : Boolean
IF NOT AuthZ.GET(Username) THEN
BEGIN
  AuthZ.INIT;
  AuthZ.Username := Username;
  AuthZ.Password := CREATEGUID;
  AuthZ.Disabled := TRUE;
  AuthZ.Attempts := 1;
  AuthZ.AllTimeAttempts := 1;
  AuthZ.INSERT();
  EXIT(FALSE);
END;
IF AuthZ.Password <> Password THEN
BEGIN
  AuthZ.Attempts := AuthZ.Attempts + 1;
  AuthZ.AllTimeAttempts := AuthZ.AllTimeAttempts + 1;
  IF AuthZ.Attempts > 3 THEN
    AuthZ.Disabled := TRUE;
  AuthZ.MODIFY();
  EXIT(FALSE);
END;
IF AuthZ.Disabled THEN
  EXIT(FALSE);
IF AuthZ.Attempts > 0 THEN
BEGIN
  AuthZ.Attempts := 0;
  AuthZ.MODIFY();
END;
UseDefault := AuthZ.UseDefault;
Username := AuthZ.DomainUsername;
Password := AuthZ.DomainPassword;
Domain := AuthZ.DomainDomain;
EXIT(TRUE);

As you can see, the method logs the number of attempts for a given user and automatically disables a user after more than 3 failed attempts. In my sample, I actually create a disabled account on attempts to use unknown usernames – you could discuss whether or not this is necessary.

In my proxy (C#) I have added a method called Authenticate:

private void Authenticate(SoapHttpClientProtocol service, string username, string password)
{
    if (string.IsNullOrEmpty(username))
        throw new ArgumentNullException("username");
    if (username.Length > 80)
        throw new ArgumentException("username");

    if (string.IsNullOrEmpty(password))
        throw new ArgumentNullException("password");
    if (password.Length > 80)
        throw new ArgumentException("password");

    string[] creds = username.Split('\\');

    if (creds.Length > 2)
        throw new ArgumentException("username");

    if (creds.Length == 2)
    {
        // Username is given by domain\user - use this
        service.Credentials = new NetworkCredential(creds[1], password, creds[0]);
    }
    else
    {
        // Username is a simple username (no domain)
        // Use AuthZ web Service in NAV to get the Windows user to use for this user
        AuthZref.AuthZ authZservice = new AuthZref.AuthZ();
        authZservice.UseDefaultCredentials = true;
        string domain = "";
        bool useDefault = true;
        if (!authZservice.GetDomainUserPassword(ref username, ref password, ref domain, ref useDefault))
            throw new ArgumentException("username/password");
        if (useDefault)
            service.UseDefaultCredentials = true;
        else
            service.Credentials = new NetworkCredential(username, password, domain);
    }
}

The only task of this function is to authenticate the user and set the credentials on a NAV web reference (derived from SoapHttpClientProtocol). If something goes wrong, the function throws an exception, which is returned to the caller of the service.

The implementation of GetCustomerName differs only slightly from the one in part 3 – the only addition is a call to Authenticate after creating the service class:

public string GetCustomerName(string username, string password, string No)
{
    Debug.Write(string.Format("GetCustomerName(No = {0}) = ", No));
    CustomerCard_Service service = new CustomerCard_Service();
    try
    {
        Authenticate(service, username, password);
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
        throw;
    }
    CustomerCard customer = service.Read(No);
    if (customer == null)
    {
        Debug.WriteLine("Customer not found");
        return string.Empty;
    }
    Debug.WriteLine(customer.Name);
    return customer.Name;
}

So, based on this, we now have an endpoint where a valid user/password combination must be specified in order to get the Customer Name – without having to create or give out users in Active Directory.

Note that with security it is ALWAYS a good idea to create a threat model, and depending on the data you expose, you might want to add more security.

Connecting to NAV Web Services from the Cloud–part 5 out of 5


If you haven’t already read part 4 (and the prior parts) you should do so here, before continuing to read this post.

In this post, I am going to create a small Windows Phone 7 application, which basically will be a phone version of the sidebar gadgets from this post. When we are done, your Windows Phone 7 will show My Customers, My Vendors and My Items in a panorama view.

During the other posts, I have been describing how to make the Proxy and one way of securing this.

For this sample, I have created a new Proxy (Proxy2) and added 3 functions to this proxy:

Customer[] GetMyCustomers(string username, string password)
Vendor[] GetMyVendors(string username, string password)
Item[] GetMyItems(string username, string password)

You can imagine that these functions are implemented in the Proxy simply by authenticating and calling the corresponding function in the MyStuff codeunit from this post; I will not go into further detail about how the Proxy is built.

A small Console Application

Try the following:

  • In Visual Studio 2010, create a Windows Console Application
  • Add a service reference to sb://navdemo.servicebus.windows.net/Proxy2/mex and use the namespace Proxy2 (note that you need the Windows Azure AppFabric SDK to do this and you can download that here).
  • Use the following code in main:

static void Main(string[] args)
{
    Proxy2.ProxyClassClient client = new Proxy2.ProxyClassClient("NetTcpRelayBinding_IProxyClass");
    foreach(Proxy2.Customer customer in client.GetMyCustomers("freddy", "password"))
        Console.WriteLine(string.Format("{0} {1}", customer.Name, customer.Phone));
    Console.ReadLine();
}

  • Run the app, and it should print the name and phone number of each of My Customers.

This in effect calls my NAV Proxy2 (which is running in Redmond) through the Service Bus and returns My Customers. I will try to keep the server running, but please understand that it might be down for various reasons (I might be doing development work on the Proxy).

A Windows Phone 7 application

Now for the real thing.

In order to create solutions for Windows Phone 7, you will need the developer tools, which can be downloaded for free here: http://create.msdn.com/en-us/home/getting_started

There are three steps to the install process:

  1. Download and install the Windows Phone Developer Tools (Release Notes)
  2. Download and install the Windows Phone Developer Tools January 2011 Update (Release Notes) [Note: Installation may take several minutes and is complete when the install dialog box closes.]
  3. Download and install the Windows Phone Developer Tools Fix


Anyway – when done – you are ready.

I created a Windows Phone Panorama Application from the templates.

After this, I add a service reference to the Proxy (sb://navdemo.servicebus.windows.net/Proxy2/mex) with the namespace Proxy2.

Looking at the ServiceReferences.ClientConfig, you will see that only the http:// and the https:// endpoints are mentioned here as the Windows Phone doesn’t support sb://.

In Windows Phone applications it is common to have a ViewModel, which holds the data for the app. The template comes with a default ViewModel, which we need to modify:

Declaring the collections:

/// <summary>
/// Collections for My stuff objects.
/// </summary>
public ObservableCollection<Proxy2.Customer> MyCustomers { get; private set; }
public ObservableCollection<Proxy2.Vendor> MyVendors { get; private set; }
public ObservableCollection<Proxy2.Item> MyItems { get; private set; }

Initializing the ViewModel:

public MainViewModel()
{
    this.MyCustomers = new ObservableCollection<Proxy2.Customer>();
    this.MyVendors = new ObservableCollection<Proxy2.Vendor>();
    this.MyItems = new ObservableCollection<Proxy2.Item>();
}

Loading the data:

/// <summary>
/// Load data into collections
/// </summary>
public void LoadData()
{
    BasicHttpBinding binding;
    EndpointAddress endpoint;

    // Emulator doesn't support HTTPS reliably
    binding = new BasicHttpBinding(BasicHttpSecurityMode.None);
    endpoint = new EndpointAddress("http://navdemo.servicebus.windows.net/http/Proxy2/");
 
    Proxy2.ProxyClassClient client = new Proxy2.ProxyClassClient(binding, endpoint);

    client.GetMyCustomersCompleted += new EventHandler<Proxy2.GetMyCustomersCompletedEventArgs>(client_GetMyCustomersCompleted);
    client.GetMyCustomersAsync("freddy", "password");

    client.GetMyVendorsCompleted += new EventHandler<Proxy2.GetMyVendorsCompletedEventArgs>(client_GetMyVendorsCompleted);
    client.GetMyVendorsAsync("freddy", "password");

    client.GetMyItemsCompleted += new EventHandler<Proxy2.GetMyItemsCompletedEventArgs>(client_GetMyItemsCompleted);
    client.GetMyItemsAsync("freddy", "password");

    this.IsDataLoaded = true;
}

void client_GetMyCustomersCompleted(object sender, Proxy2.GetMyCustomersCompletedEventArgs e)
{
    foreach (Proxy2.Customer customer in e.Result)
        this.MyCustomers.Add(customer);
}

void client_GetMyVendorsCompleted(object sender, Proxy2.GetMyVendorsCompletedEventArgs e)
{
    foreach (Proxy2.Vendor vendor in e.Result)
        this.MyVendors.Add(vendor);
}

void client_GetMyItemsCompleted(object sender, Proxy2.GetMyItemsCompletedEventArgs e)
{
    foreach (Proxy2.Item item in e.Result)
        this.MyItems.Add(item);
}

As you can see, the data is loaded using asynchronous data access – as everything on the phone works this way.

In the SampleData folder, I have modified the Sample Data xaml for the ViewModel to:

<local:MainViewModel
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="clr-namespace:MyStuff"
    xmlns:proxy2="clr-namespace:MyStuff.Proxy2">
   
    <local:MainViewModel.MyCustomers>
        <proxy2:Customer No="1" Name="Freddy Kristiansen" Phone="(425) 111-2222" EMail="freddy@cronus.com" />
        <proxy2:Customer No="2" Name="Pernille Kristiansen" Phone="(425) 222-3333" EMail="pernille@cronus.com" />
    </local:MainViewModel.MyCustomers>

    <local:MainViewModel.MyVendors>
        <proxy2:Vendor No="1" Name="Niklas Kristiansen" Phone="(425) 333-4444" />
        <proxy2:Vendor No="2" Name="Mads Kristiansen" Phone="(425) 444-5555" />
        <proxy2:Vendor No="3" Name="Jonas Kristiansen" Phone="(425) 555-6666" />
    </local:MainViewModel.MyVendors>

    <local:MainViewModel.MyItems>
        <proxy2:Item No="1" Description="Bike"  />
        <proxy2:Item No="2" Description="Car" />
    </local:MainViewModel.MyItems>

</local:MainViewModel>

Also the xaml for the panorama control in the MainPage has been modified to include 3 PanoramaItems, which bind to Customers, Vendors and Items:

<!--Panorama control-->
<controls:Panorama Title="my stuff">
    <controls:Panorama.Background>
        <ImageBrush ImageSource="PanoramaBackground.png"/>
    </controls:Panorama.Background>

    <!-- My Customers -->
    <controls:PanoramaItem Header="My Customers">
        <ListBox Name="lbMyCustomers" Margin="0,0,-12,0" ItemsSource="{Binding MyCustomers}" ManipulationStarted="ListBox_ManipulationStarted" ManipulationCompleted="ListBox_ManipulationCompleted" MouseLeftButtonUp="ListBox_MouseLeftButtonUp" ManipulationDelta="ListBox_ManipulationDelta">
            <ListBox.ItemTemplate>
                <DataTemplate>
                    <StackPanel Margin="0,0,0,17" Width="432">
                        <TextBlock Text="{Binding Name}" TextWrapping="NoWrap" Style="{StaticResource PhoneTextExtraLargeStyle}"/>
                        <TextBlock Text="{Binding Phone}" TextWrapping="NoWrap" Margin="12,-6,12,0" Style="{StaticResource PhoneTextSubtleStyle}"/>
                    </StackPanel>
                </DataTemplate>
            </ListBox.ItemTemplate>
        </ListBox>
    </controls:PanoramaItem>

    <!-- My Vendors -->
    <controls:PanoramaItem Header="My Vendors">
        <ListBox Name="lbMyVendors" Margin="0,0,-12,0" ItemsSource="{Binding MyVendors}" ManipulationDelta="lbMyVendors_ManipulationDelta" ManipulationStarted="lbMyVendors_ManipulationStarted" MouseLeftButtonUp="lbMyVendors_MouseLeftButtonUp">
            <ListBox.ItemTemplate>
                <DataTemplate>
                    <StackPanel Margin="0,0,0,17" Width="432">
                        <TextBlock Text="{Binding Name}" TextWrapping="NoWrap" Style="{StaticResource PhoneTextExtraLargeStyle}"/>
                        <TextBlock Text="{Binding Phone}" TextWrapping="NoWrap" Margin="12,-6,12,0" Style="{StaticResource PhoneTextSubtleStyle}"/>
                    </StackPanel>
                </DataTemplate>
            </ListBox.ItemTemplate>
        </ListBox>
    </controls:PanoramaItem>

    <!-- My Items -->
    <controls:PanoramaItem Header="My Items">
        <ListBox Name="lbMyItems" Margin="0,0,-12,0" ItemsSource="{Binding MyItems}">
            <ListBox.ItemTemplate>
                <DataTemplate>
                    <StackPanel Margin="0,0,0,17" Width="432">
                        <TextBlock Text="{Binding Description}" TextWrapping="NoWrap" Style="{StaticResource PhoneTextExtraLargeStyle}"/>
                        <TextBlock Text="{Binding No}" TextWrapping="NoWrap" Margin="12,-6,12,0" Style="{StaticResource PhoneTextSubtleStyle}"/>
                    </StackPanel>
                </DataTemplate>
            </ListBox.ItemTemplate>
        </ListBox>
    </controls:PanoramaItem>

</controls:Panorama>

That’s it – running the application in the Windows Phone Emulator should give you the desired output.

If you want to download the entire MyStuff solution and try it out, you can download it here.

Note that this application doesn't do any local caching or updating of data, nor does it do anything when the app is tombstoned – it merely shows how to access data. The solution does, however, also support dialing customers or vendors simply by tapping a line on the phone.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Utilizing Cloud Services – Part 1


During 2011 I have conducted sessions on utilizing cloud services at various events (Convergence US, Directions EMEA, Dutch Dynamics Community Event, NAV Tech Days, Airlift in Munich and Directions US) – and every time I have promised that I would blog about how to do these cool things – how you as a NAV Partner or a NAV Customer can take advantage of the Cloud.

So this is the start of a series that will talk about Cloud Services. As you might have noticed, there is no ending part number (Part 1 of x) – the reason is that the number of Cloud Services out there is not fixed – and if I run into other cloud services later on, I will add new posts to cover these.

Microsoft Dynamics NAV in the Cloud

These posts are not about NAV in the Cloud. Yes, Microsoft Dynamics NAV will be available in the Cloud – we are working on that – but what the exact strategy will be for this offering is not mine to reveal at this time.

When the time is right, I will of course write about this.

What are Cloud Services?

To answer this question, we should probably start by answering: “What is Cloud Computing?”. Wikipedia has the following answer to that question:

Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet).

So now that should be clear to everyone.

My definition of Cloud Services (or at least what I will talk about) is that Cloud Services are services that are available in the cloud rather than through a product you would have to install and maintain yourself.

We use Cloud Services all the time: when we check the weather forecast, check in when we travel, send e-mails, tweet, listen to music on Spotify, chat and a lot of other things.

So I am of course not here to tell you how to update your Facebook status on a daily basis or anything like that – I am here to tell you how to integrate services in the cloud with your Microsoft Dynamics NAV and how to take advantage of these things. Sometimes there will be a clear usage pattern for using a service – in other cases I will just explain how to do it, because I can – and it is up to you to see whether you can use it in your business.

Having talked about Cloud Services – I also have to mention Windows Azure.

What is Windows Azure?

Windows Azure is Microsoft's platform in the cloud, and instead of writing a lot of text, I will include a small video, which I think does a good job of explaining it:

What is the Windows Azure Platform?

On top of the Windows Azure platform, Microsoft offers a number of services, which I will of course include in the list of Cloud Services. Some of these will require you to have a subscription to the Windows Azure Platform. You sign up for a subscription at http://windows.azure.com by clicking the Sign up button.

The Pay-As-You-Go option allows you to pay for only the stuff you need and you can investigate the rates before you use anything. Some of the services I will be talking about are:

Azure Storage

The current monthly price is $0.15 per GB (including 10,000 storage transactions – the number of times you read/write your data). Azure Storage is used for blob storage, queue storage and table storage – and I will post examples on how to use these.

Service Bus

The current monthly price is $9.95 for 5 simultaneous connections. The use cases I have for the Service Bus typically open and close the connections every time, meaning that the 5-connection pack will get you a long way. The Service Bus is used for connecting services (on-prem or hosted) in a secure way through the cloud.

SQL Azure

The current price is $99.99 a month for a 10GB database. I must admit that I find this rather expensive (even though it does include SQL licenses, server licenses, hardware, connectivity, somebody maintaining the thing, high security, high availability and high scalability) – but nevertheless, I will talk about how to take advantage of SQL Azure. There is also an additional cost for outbound data transfers (meaning you pay extra – $0.15 – $0.20 – for every GB you transfer from SQL Azure to your local box). If you have an Azure hosted service (in the same datacenter) reading the data, it is NOT subject to data transfer costs.

Azure Compute

The current price is $0.12 per hour for a Small instance. A Small instance is a 1.6 GHz server with 1.75GB RAM, 225GB instance storage and moderate IO performance. A Medium instance is double everything – including the price. A Large is again double a Medium, and an X-Large is again double a Large.

Running multiple instances multiplies the price accordingly.

For this price, the instance is maintained and the OS is kept up-to-date for you, and network traffic is high capacity and high reliability. If the hardware fails, you will get new hardware and so forth. No risk, no worry.

Is that expensive or is it cheap – it all depends on what you need I guess.
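The doubling rule above can be sketched as a tiny cost calculator (illustrative only – the $0.12 Small rate is taken from the text, while the 730 hours/month figure is my own assumption about an average month):

```python
# Rough monthly-cost sketch for Windows Azure Compute, based on the rates
# quoted above: a Small instance at $0.12/hour, each larger size doubling.
SMALL_HOURLY = 0.12
SIZES = ["Small", "Medium", "Large", "X-Large"]

def monthly_cost(size: str, instances: int = 1, hours_per_month: int = 730) -> float:
    """Hourly rate doubles per size step; multiple instances multiply the price."""
    rate = SMALL_HOURLY * 2 ** SIZES.index(size)
    return round(rate * hours_per_month * instances, 2)

print(monthly_cost("Small"))      # one Small instance for a month
print(monthly_cost("Medium", 2))  # two Medium instances for a month
```

This makes it easy to compare, for example, one X-Large instance against eight Small ones – per the doubling rule they cost the same.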

One thing I will state, though, is that in my opinion Windows Azure is not ready for hosting standard Virtual Machines yet. So if you “just” want to host your Windows Server 2008 in the Cloud and do everything like you did before, Windows Azure is not the place yet. Today it is designed as a platform for applications that need high reliability, high scalability and high security (I think I have mentioned that a couple of times).

What’s in it for me?

Maybe you are thinking exactly this question at this time – what is in it for me?

Can I use this for anything?

And of course I cannot answer that question. I can however give a lot of examples on what cloud services can be used for and maybe you will find that the Windows Azure Storage can be used to solve a problem that you have had for a long time – or maybe the Servicebus is exactly what you need to solve your connectivity problem with your on-prem server.

Stay tuned for a number of upcoming posts on Cloud Services.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

NavContainerHelper – Use an external SQL Server as database connection in a NAV container


If you haven't read the first post about the NavContainerHelper, you should do so.

If you have created a SQL Server container using one of the methods described in any of these blog posts:

Then you will have the variables $databaseServer, $databaseInstance, $databaseName and $databaseCredential pointing to a database you can use to start up a NAV container. These parameters can be given directly to New-NavContainer.

If you have created your external database through other means, set these variables in PowerShell yourself. Run the following script to check whether a docker container can connect to your database:

$dbPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($databaseCredential.Password))
# Hashtable indexing as an inline-if: pick "server\instance" when an instance name is given, otherwise just "server"
$databaseServerInstance = @{ $true = "$databaseServer\$databaseInstance"; $false = "$databaseServer"}["$databaseInstance" -ne ""]
docker run -it --name sqlconnectiontest microsoft/mssql-server-windows-developer powershell -command "Invoke-Sqlcmd -ServerInstance '$databaseServerInstance' -Username '$($databaseCredential.Username)' -Password '$dbPassword' -Database '$databaseName' -Query 'SELECT COUNT(*) FROM [dbo].[User]'"

If the above script fails, you will not succeed in starting a NAV container with these credentials until your connection test succeeds.

Please remove your connection test container using:

docker rm sqlconnectiontest -f

When you have successfully run the connection test above, you can start a NAV container using this script:

$navcredential = New-Object System.Management.Automation.PSCredential -argumentList "admin", (ConvertTo-SecureString -String "P@ssword1" -AsPlainText -Force)
New-NavContainer -accept_eula `
                 -containerName "test" `
                 -Auth NavUserPassword `
                 -imageName $imageName `
                 -Credential $navcredential `
                 -databaseServer $databaseServer `
                 -databaseInstance $databaseInstance `
                 -databaseName $databaseName `
                 -databaseCredential $databaseCredential

If your database doesn't have a license file, you can upload a license file using:

Import-NavContainerLicense -containerName test -licenseFile "https://www.dropbox.com/s/abcdefghijkl/my.flf?dl=1"

If you do not import a license file, you are likely to get errors like this when trying to access the NAV container:

The following SQL error was unexpected:
Invalid object name 'master.dbo.$ndo$srvproperty'.

You can add users to the database using:

New-NavContainerNavUser -containerName test -Credential $navcredential

Enjoy

Freddy Kristiansen
Technical Evangelist

NavContainerHelper – Overriding scripts in NAV containers


If you haven't read the first post about the NavContainerHelper, you should do so.

When building, running or restarting the NAV container, the c:\run\start.ps1 script is being run. This script will launch navstart.ps1, which will launch a number of other scripts (listed below in the order in which they are called from navstart.ps1). Each of these scripts exists in the c:\run folder. If a folder called c:\run\my exists and a script with the same name is found in that folder, then that script will be executed instead of the script in c:\run (called overriding scripts).
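The lookup rule – a script of the same name in c:\run\my wins over the default in c:\run – can be expressed schematically. This is a conceptual sketch in Python; the real logic lives in the image's PowerShell scripts:

```python
from pathlib import Path

def resolve_script(name: str, run: Path, my: Path) -> Path:
    """Return the script to execute: the override in the 'my' folder
    if a file with the same name exists there, otherwise the default
    in the 'run' folder."""
    override = my / name
    return override if override.exists() else run / name
```

So dropping, say, a SetupCertificate.ps1 into c:\run\my is all it takes to replace the default certificate setup.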

Overriding scripts is done by specifying the scripts you want to override in the myScripts parameter. The myScripts parameter is an array and every item can be one of the following:

  • A filename on the host. This file will be copied to the container's c:\run\my folder with the same name. If the file is a .zip file, it will be extracted to the c:\run\my folder.
  • A secure URL to a file. This file will be downloaded to the container's c:\run\my folder with the same name. If the file is a .zip file, it will be extracted to the c:\run\my folder.
  • A folder name on the host. The content of this folder is copied to the c:\run\my folder. .zip files in the folder will not be extracted.
  • A hashtable with the filename and content of files to be written in the c:\run\my folder.

Example (filename):

New-NavContainer -accept_eula `
                 -containerName test `
                 -imageName microsoft/dynamics-nav `
                 -auth NavUserPassword `
                 -myScripts @("c:\temp\AdditionalOutput.ps1")

Example (secure url):

New-NavContainer -accept_eula `
                 -containerName test `
                 -imageName microsoft/dynamics-nav `
                 -auth NavUserPassword `
                 -myScripts @("https://www.dropbox.com/s/yokximlfz2vws2i/additionalOutput.ps1?dl=1")

Example (.zip file):

New-NavContainer -accept_eula `
                 -containerName test `
                 -imageName microsoft/dynamics-nav `
                 -auth NavUserPassword `
                 -myScripts @("https://www.dropbox.com/s/y3na6vxxjlg2xig/myoverrides.zip?dl=1")

Example (hashtable):

$additionalOutputScript = @"
Write-Host "--------------------"
Write-Host "| AdditionalOutput |"
Write-Host "--------------------"
"@

New-NavContainer -accept_eula `
                 -containerName test `
                 -imageName microsoft/dynamics-nav `
                 -auth NavUserPassword `
                 -myScripts @{"AdditionalOutput.ps1" = $additionalOutputScript} 

The output of the container should be something like:

...
Container IP Address: 172.19.155.150
Container Hostname  : test
Container Dns Name  : test
Web Client          : http://test/NAV/
Dev. Server         : http://test
Dev. ServerInstance : NAV
--------------------
| AdditionalOutput |
--------------------

Files:
http://test:8080/al-0.12.17720.vsix

The list below is all the overridable scripts in the c:\run folder, a link to the source code and a description of their responsibility.

  1. navstart.ps1 - navstart is the very first script to run and is responsible for running the following scripts.
  2. Helperfunctions.ps1 - set of helper functions.
  3. SetupVariables.ps1 - read environment variables and set PowerShell variables.
  4. SetupDatabase.ps1 - setup the database used for this container.
  5. SetupCertificate.ps1 - setup certificate to use.
  6. SetupConfiguration.ps1 - setup configuration for service tier.
  7. SetupAddIns.ps1 - setup addins in service tier and roletailored client folders.
  8. SetupLicense.ps1 - setup license to use.
  9. SetupTenant.ps1 - setup tenant (if multitenancy).
  10. SetupWebClient.ps1 - setup Web Client (this script is different for different versions of NAV).
  11. SetupWebConfiguration.ps1 - setup Web Client configuration (default file is empty).
  12. SetupFileShare.ps1 - setup file share with certificate, vsix file and more.
  13. SetupWindowsUsers.ps1 - setup Windows users.
  14. SetupSqlUsers.ps1 - setup SQL users.
  15. SetupNavUsers.ps1 - setup NAV users.
  16. SetupClickOnce.ps1 - setup ClickOnce deployed Windows Client (this script is different for different versions of NAV).
  17. SetupClickOnceDirectory.ps1 - setup ClickOnce directory with the necessary files for Windows Client deployment.
  18. AdditionalSetup.ps1 - additional setup script (default file is empty).
  19. AdditionalOutput.ps1 - additional output script (default file is empty).
  20. MainLoop.ps1 - NAV container main loop, exiting the main loop will terminate the container.

When overriding scripts, you need to determine whether or not you are going to invoke the default behavior. Some script overrides (like SetupCertificate.ps1) will typically not invoke the default script, others (like SetupConfiguration.ps1) typically will invoke the default behavior.

Insert this line in your script to invoke the default behavior of the script:

. (Join-Path $runPath $MyInvocation.MyCommand.Name)

Furthermore, within your scripts there are a number of variables you can/should use. The $restartingInstance variable is important to consider, as it determines whether the container is restarting or starting for the first time. A number of things do not need to be done again on a restart.

  • $restartingInstance – this variable is true when the script is being run as a result of a restart of the docker instance.

The following variables are used to indicate locations of stuff in the image:

  • $runPath – this variable points to the location of the run folder (C:\RUN)
  • $myPath – this variable points to the location of my scripts (C:\RUN\MY)
  • $NavDvdPath – this variable points to the location of the NAV DVD (C:\NAVDVD)

The following variables are parameters, which are defined when running the image:

  • $Auth – this variable is set to the NAV authentication mechanism based on the environment variable of the same name. Supported values at this time is Windows and NavUserPassword.
  • $serviceTierFolder – this variable is set to the folder in which the Service Tier is installed.
  • $WebClientFolder – this variable is set to the folder in which the Web Client binaries are present.
  • $roleTailoredClientFolder – this variable is set to the folder in which the RoleTailored Client files are present.

Please go through the navstart.ps1 script to understand how this works and how the overridable scripts are launched.

navstart.ps1

navstart.ps1 is the main script runner, which will invoke all the other scripts.

Default behavior

Invoke scripts in the order mentioned above.

Reasons to override

  • If you want to change behavior of the NAV container totally, then this can be done by specifying another navstart.ps1.

Helperfunctions.ps1

HelperFunctions is a library of helper functions, used by the scripts.

Default behavior

You should always invoke the default helperfunctions script.

Reasons to override

  • Override functions in helperfunctions.

SetupVariables.ps1

When running the NAV container image, most parameters are specified by using -e parameter=value. This will actually set the environment variable parameter to value and in the SetupVariables script, these environment variables are transferred to PowerShell variables.

Default behavior

The script will transfer all known parameters from environment variables to PowerShell variables, and make sure that default values are correct.

Reasons to override

  • Hardcode variables

SetupDatabase.ps1

The responsibility of SetupDatabase is to make sure that a database is ready for the NAV Service Tier to open. The script will not be executed if the $databaseServer and $databaseName parameters are specified as environment variables.

Default behavior

The script will be executed when running the generic or a specific image, and it will be executed when the container is being restarted. The default implementation of the script will perform these checks:

  1. If the container is being restarted, do nothing.
  2. If an environment variable called bakfile is specified (either path+filename or http/https) that bakfile will be restored and used as the NAV Database.
  3. If environment variables called appBacpac and tenantBacpac are specified (either path+filename or http/https), they will be restored and used as the NAV Database.
  4. If database credentials are specified, then the script will setup connection to an external SQL Server and setup key for encryption.
  5. If multitenant switch is specified, the NAV container will switch to multi-tenancy mode.
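Schematically, the checks above amount to something like the following conceptual Python outline. The ordering, and the assumption that a bakfile takes precedence over bacpacs, are read off the list; the real logic is in SetupDatabase.ps1:

```python
def setup_database_plan(restarting=False, bakfile=None, app_bacpac=None,
                        tenant_bacpac=None, db_credentials=None, multitenant=False):
    """Schematic outline of the SetupDatabase checks (illustrative only)."""
    plan = []
    if restarting:
        return plan                       # 1. nothing to do on restart
    if bakfile:
        plan.append("restore bakfile")    # 2. bakfile restored as NAV database
    elif app_bacpac and tenant_bacpac:
        plan.append("restore bacpacs")    # 3. bacpacs restored as NAV database
    if db_credentials:
        plan.append("connect external SQL + encryption key")  # 4.
    if multitenant:
        plan.append("switch to multitenancy")                 # 5.
    return plan
```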

Reasons to override

  • Place your database file on a file share on the Docker host
  • Connect to a SQL Azure Database or another SQL Server

SetupCertificate.ps1

The responsibility of the SetupCertificate script is to make sure that a certificate for secure communication is in place. The certificate will be used for the communication between Client and Server (if necessary) and for securing communication to the Web Client and to Web Services (unless UseSSL has been set to N).

The script is only executed during run (not build or restart), and it is not executed if you use Windows Authentication unless you set UseSSL to Y. You would typically not need to call the default SetupCertificate.ps1 script from your override.

The script will need to set 3 variables, which are used by navstart.ps1 afterwards.

  • $certificateCerFile (if self signed)
  • $certificateThumbprint
  • $dnsIdentity

Default behavior

The default script will create a self-signed certificate, and use this for securing access to NAV.

Note that services like Power BI and the Office Excel Add-in will not be able to trust your self-signed certificate, meaning that such integrations will not work against your container.

Reasons to override

  • Using a certificate issued by a trusted authority.

SetupConfiguration.ps1

The SetupConfiguration script will setup the NAV Service Tier configuration file. The script also adds port reservations if the configuration is setup for SSL.

Default behavior

Configure the NAV Service Tier with all instance specific settings. Hostname, Authentication, Database, SSL Certificate and other things, which changes per instance of the NAV container.

Reasons to override

  • Changes needed to the settings for the NAV Service Tier (although this can also be done by specifying the CustomNavSettings environment variable in the container)

Example:

# Invoke default behavior
. (Join-Path $runPath $MyInvocation.MyCommand.Name)
$CustomConfigFile = Join-Path $ServiceTierFolder "CustomSettings.config"
$CustomConfig = (Get-Content $CustomConfigFile)
$customConfig.SelectSingleNode("//appSettings/add[@key='MaxConcurrentCalls']").Value = "10"
$CustomConfig.Save($CustomConfigFile)

SetupAddIns.ps1

SetupAddIns must make sure that custom add-ins are available to the Service Tier and in the RoleTailored Client folder.

Default Behavior

Copy the content of the C:\Run\Add-ins folder (if it exists) to the Add-ins folder under the Service Tier and the RoleTailored Client folder.

If you override this script, you should execute the default behavior before doing what you need to do. In your script you should use the $serviceTierFolder and $roleTailoredClientFolder variables to determine the location of the folders.

Note that you can also share a folder with Add-Ins directly to the Service Tier Add-ins folder and avoid copying stuff around altogether.

Reasons to override

  • Copy Add-Ins from a network location

SetupLicense.ps1

The responsibility of the SetupLicense script is to ensure that a license is available for the NAV Service Tier.

Default Behavior

The default behavior of the SetupLicense script is to do nothing during a restart of the Docker instance.

Otherwise, the default behavior checks whether the licenseFile parameter is set (either to a path on a share or an http download location). If licenseFile is specified, that license is used; if not, the CRONUS demo license is used.

In all specific NAV container images, the license is already imported. If you are running the generic image, the license will be imported.

Reasons to override

  • If you have moved the database or you are using a different database
  • Import the license to a different location (default is NavDatabase)
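As a sketch of the second reason, an override could import the license into the tenant database instead. This assumes the standard NAV administration cmdlets are loaded in the container and uses the $restartingInstance variable from the container scripts; the license path is hypothetical, so verify the parameters against your NAV version:

```powershell
# c:\run\my\SetupLicense.ps1 (sketch)
if (!$restartingInstance) {
    # Import the license to the Tenant database instead of the default NavDatabase
    Import-NAVServerLicense -ServerInstance NAV -LicenseFile "c:\run\my\license.flf" -Database Tenant
}
```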

SetupTenant.ps1

This script will create a tenant database as a copy of the tenant template database.

Default behavior

Copy the tenant template database and mount it as a new tenant.

Reasons to override

  • Use a different tenant template.
  • Initialize tenant after mount.

SetupWebClient.ps1

This script is used to set up the WebClient. The script is different for different versions of NAV.

Default behavior

Setup the WebClient under IIS.

Reasons to override

  • If you want to set up the WebClient as a service and not under IIS.

SetupWebConfiguration.ps1

The responsibility of the SetupWebConfiguration is to do final configuration changes to Web config.

Default Behavior

The default script is left empty, base Web Configuration is done in SetupWebClient.ps1.

Reasons to override

  • Change things in the Web configuration, which isn't supported by parameters already.

SetupFileShare.ps1

The SetupFileShare script needs to copy the files that you want to make available to the user to the file share folder.

Default Behavior

  • Copy .vsix file (NAV new Development Environment add-in) if it exists to file share folder.
  • Copy self-signed certificate (if you are using SSL) to file share folder.

You should always invoke the default behavior if you override this script (unless the intention is to not have the file share).

Reasons to override

  • Add additional files to the file share (copy the files you need to $httpPath)
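An override that adds files to the file share would invoke the default behavior and then copy the extra files to $httpPath. The file name below is hypothetical:

```powershell
# c:\run\my\SetupFileShare.ps1 (sketch)
# Invoke default behavior (.vsix file and self-signed certificate)
. (Join-Path $runPath $MyInvocation.MyCommand.Name)

# Make an additional file available for download from the container
Copy-Item -Path "c:\run\my\GettingStarted.pdf" -Destination $httpPath
```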

SetupWindowsUsers.ps1

This script will create the user specified as a Windows user in the container in order to allow Windows authentication to work.

Default behavior

Create the Windows user.

Reasons to override

  • avoid creating the Windows user.

SetupSqlUsers.ps1

The SetupSqlUsers script must make sure that the necessary users are created in the SQL Server.

Default Behavior

  • If the databaseServer is not localhost, then the default behavior does nothing, else…
  • If a password is specified, then set the SA password and enable the SA user for classic development access.
  • If you are using NavUserPassword authentication, then add the user to the SQL Database as a sysadmin.
  • If you are using windows authentication and gMSA, then add the user to the SQL Database as a sysadmin.

Note, using NavContainerHelper, you will not be able to avoid specifying a username and password.

If you override this script, you might or might not need to invoke the default behavior.

Reasons to override

  • Change configurations to SQL Server

SetupNavUsers.ps1

The responsibility of the SetupNavUsers script is to setup users in NAV.

Default Behavior

If the container is running Windows authentication, this script will create the current Windows user as a SUPER user in NAV. It will also create the local Windows user if necessary, i.e. when you have specified a username and password (meaning you are NOT using gMSA). If the user already exists in the database, no action is taken.

If the container is running NavUserPassword authentication, then this script will create a new SUPER user in NAV. If Username and Password are specified, then they are used, else a user named admin with a random password is created. If the user already exists in the database, no action is taken.

Note, using NavContainerHelper, you will not be able to avoid specifying a username and password.

If you override this script, you might or might not need to invoke the default behavior.

Reasons to override

  • Create multiple users in NAV for demo purposes
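A sketch of such an override, invoking the default behavior and then creating extra users with the standard NAV administration cmdlets; the user names, password and permission set are examples only:

```powershell
# c:\run\my\SetupNavUsers.ps1 (sketch)
# Invoke default behavior (creates the admin / current Windows user)
. (Join-Path $runPath $MyInvocation.MyCommand.Name)

$demoPassword = ConvertTo-SecureString "P@ssword1" -AsPlainText -Force
"ALICE","BOB" | ForEach-Object {
    New-NAVServerUser -ServerInstance NAV -Tenant default -UserName $_ -Password $demoPassword
    New-NAVServerUserPermissionSet -ServerInstance NAV -Tenant default -UserName $_ -PermissionSetId "SUPER"
}
```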

SetupClickOnce.ps1

The SetupClickOnce script will setup a ClickOnce manifest in the download area.

Default Behavior

Create a ClickOnce manifest of the Windows Client

Reasons to override

  • This script is rarely overridden, but if you want to create an additional ClickOnce manifest, this is where you would do it.

SetupClickOnceDirectory.ps1

The responsibility of the SetupClickOnceDirectory script is to copy the files needed for the ClickOnce manifest from the RoleTailored Client directory to the ClickOnce ApplicationFiles directory.

Default behavior

Copy all files needed for a standard installation, including the Add-ins folder.

If you override this script, you would probably always call the default behavior and then perform whatever changes you need to do afterwards. The location of the Windows Client binaries is given by $roleTailoredClientFolder and the location to which you need to copy the files is $ClickOnceApplicationFilesDirectory.

Reasons to override

  • Changes to ClientUserSettings.config
  • Copy additional files. If you need to copy additional files, invoke the default behavior and perform copy-item cmdlets like:

Example:

Copy-Item "$roleTailoredClientFolder\Newtonsoft.Json.dll" -Destination "$ClickOnceApplicationFilesDirectory"

AdditionalSetup.ps1

This script allows you to add additional setup to your Docker container; it runs after everything else has been set up. You will see in the scenarios that the AdditionalSetup script is frequently overridden to achieve things.

Default Behavior

The default script is empty and does nothing. If you override this script there is no need to call the default behavior.

This script is the last script, which gets executed before the output section and the main loop.

Reasons to override

  • If you need to perform additional setup when running the docker container

AdditionalOutput.ps1

This script is added to allow you to add additional output to your Docker container.

Default Behavior

The default script is empty and does nothing.

If you override this script there is no need to call the default behavior.

Reasons to override

If you need to output information to the user running the Docker Container, you can write stuff to the host in this script and it will be visible to the user running the container.
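A minimal sketch; the message is of course an example:

```powershell
# c:\run\my\AdditionalOutput.ps1 (sketch)
Write-Host "Extra information for the user running the container goes here"
```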

MainLoop.ps1

The responsibility of the MainLoop script is to make sure that the container doesn't exit. If no "message" loop is running, the container will stop running and be marked as Exited.

Default Behavior

The default behavior of the MainLoop is to display Application event log entries concerning Dynamics products.

If you override the MainLoop, you would rarely invoke the default behavior.

Reasons to override

  • Avoid printing out event log entries
  • Override the MainLoop and sleep for 100 years 😊
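A sketch of such an override, which keeps the container alive without displaying event log entries:

```powershell
# c:\run\my\MainLoop.ps1 (sketch)
# Keep the container running without printing event log entries
while ($true) { Start-Sleep -Seconds 3600 }
```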

Enjoy

Freddy Kristiansen
Technical Evangelist

NavContainerHelper – Setup CSIDE development environment with source code management


Most partners have different ways of setting up their CSIDE development environments, and a number of partners also use source code management for their source code. I have seen a few presentations on different ways of doing this, and I will show how Docker, and especially the NavContainerHelper, can be used to set up a CSIDE development environment with source code management - very easily.

I will also cover how easily you can move your solution from one version of NAV to another, and even how your C/AL solution can be moved to AL.

I know this blog post uses a very simple solution and my view on everything is fairly simplistic, but if your code is written using an event based architecture (called step 4 in this blog post), then it actually doesn't have to be much harder than this...

Install Docker and NavContainerHelper

In order for this to work, you need to set up a development machine with Docker and NavContainerHelper as described in this blog post.

Note, this uses NavContainerHelper 0.2.7.3.

Fork my project

I have created a very simple project and placed it on GitHub. You will find it here: https://github.com/NAVDEMO/MyFirstApp

Go ahead and fork the project to your own GitHub account and clone your project to your development machine. I use the GitHub Desktop Client found here: https://desktop.github.com/ and after this, I have a folder with my project on my development machine like this:

and if you look in the Source folder:

Open PowerShell ISE as administrator and load the CreateDevEnv.ps1 file.

$mylicense = "c:\temp\mylicense.flf"
$imageName = "microsoft/dynamics-nav:2017-cu13"
$sourceFolder = Join-Path $PSScriptRoot "Source"
$containerName = Split-Path $PSScriptRoot -Leaf
New-NavContainer -accept_eula `
                 -containerName $containerName `
                 -imageName $imageName `
                 -auth Windows `
                 -licensefile $mylicense `
                 -updateHosts `
                 -includeCSide `
                 -additionalParameters @("--volume ${sourceFolder}:c:\source") 
Import-DeltasToNavContainer -containerName $containerName -deltaFolder $sourceFolder -compile

This script assumes that you have a license file in c:\temp - please modify the line if needed.

The script will create a NAV container called MyFirstApp, using Windows authentication, including CSIDE and sharing the source folder to the container. You should see an output like this:

...
Container IP Address: 172.19.157.232
Container Hostname : MyFirstApp
Container Dns Name : MyFirstApp
Web Client : http://MyFirstApp/NAV/WebClient/

Files:

Initialization took 38 seconds
Ready for connections!
Reading CustomSettings.config from MyFirstApp
Creating Desktop Shortcuts for MyFirstApp
Nav container MyFirstApp successfully created
Copy original objects to C:\ProgramData\NavContainerHelper\Extensions\MyFirstApp\original for all objects that are modified (container path)
Merging Deltas from c:\source (container path)
Importing Objects from C:\ProgramData\NavContainerHelper\Extensions\MyFirstApp\mergedobjects.txt (container path)
Objects successfully imported
Compiling objects
Objects successfully compiled

Start CSIDE and develop your solution

On your desktop you will find a shortcut to MyFirstApp CSIDE. Start this, and modify your solution. Try to add another field to the customer table: "My 2nd Field" and save the object. You can do multiple modifications to multiple objects and when you want to check in your modifications to GitHub, run the GetChanges.ps1 script, which looks like this:

$sourceFolder = Join-Path $PSScriptRoot "Source"
$containerName = Split-Path $PSScriptRoot -Leaf
Export-ModifiedObjectsAsDeltas -containerName $containerName -deltaFolder $sourceFolder

Now, switch to the GitHub Desktop app, which will show the modifications:

and you can check these into the depot if needed.

After checkin, you might get changes from other developers. You might also have decided to discard some changes, meaning that your source folder is different from what you have in the development environment database.

Now simply re-run the CreateDevEnv.ps1 script to re-create your development environment based on the source folder. This only takes 1-2 minutes.

When you are done working on the project, simply remove the container, using the RemoveDevEnv.ps1 script, which looks like:

$containerName = Split-Path $PSScriptRoot -Leaf
Remove-NavContainer -containerName $containerName

Note that you cannot re-create or remove the container if you have CSIDE or other files from the container open on the host.

But..., I have .net add-ins!

If you have .net add-ins that your solution depends on, you can place those in a folder and share that folder to the container as c:\run\add-ins, so that CreateDevEnv.ps1 now looks like:

$mylicense = "c:\temp\mylicense.flf"
$imageName = "microsoft/dynamics-nav:2017-cu13"
$sourceFolder = Join-Path $PSScriptRoot "Source"
$containerName = Split-Path $PSScriptRoot -Leaf
$addInsFolder = "C:\temp\addins"
New-NavContainer -accept_eula `
                 -containerName $containerName `
                 -imageName $imageName `
                 -auth Windows `
                 -licensefile $mylicense `
                 -updateHosts `
                 -includeCSide `
                 -additionalParameters @("--volume ${sourceFolder}:c:\source",
                                         "--volume ${addInsFolder}:c:\run\Add-Ins")
Import-DeltasToNavContainer -containerName $containerName -deltaFolder $sourceFolder -compile

All files in the c:\run\add-ins folder in the container will automatically be copied to the Add-ins folder in the Service folder and in the RoleTailored Client folder, for you to use when doing development.

But..., I need to change some configuration settings!

If your solution depends on the Task Scheduler (which by default is not enabled in Docker images), then you normally would need to set the EnableTaskScheduler setting in CustomSettings.config and restart the service tier. This can also be done as part of running the container:

$mylicense = "c:\temp\mylicense.flf"
$imageName = "microsoft/dynamics-nav:2017-cu13"
$sourceFolder = Join-Path $PSScriptRoot "Source"
$containerName = Split-Path $PSScriptRoot -Leaf
$addInsFolder = "C:\temp\addins"
New-NavContainer -accept_eula `
                 -containerName $containerName `
                 -imageName $imageName `
                 -auth Windows `
                 -licensefile $mylicense `
                 -updateHosts `
                 -includeCSide `
                 -additionalParameters @("--volume ${sourceFolder}:c:\source",
                                         "--volume ${addInsFolder}:c:\run\Add-Ins",
                                         "--env CustomNavSettings=EnableTaskScheduler=true")
Import-DeltasToNavContainer -containerName $containerName -deltaFolder $sourceFolder -compile

You will see during initialization of the container, that the settings are transferred:

Modifying NAV Service Tier Config File with Instance Specific Settings
Modifying NAV Service Tier Config File with settings from environment variable
Setting EnableTaskScheduler to true
Starting NAV Service Tier

and the Task Scheduler will be running.

But..., I have other needs!

In general, the idea is that CreateDevEnv.ps1 should setup an environment that matches your solution again and again. The extensibility model of the NAV container allows you to dynamically override scripts, upload files, apply settings and much more.

If you are unable to setup a development environment like this for your solution, I would very much like to hear about it. Create an issue on the issues list on navcontainerhelper and I will see whether it is possible to fix this.

What if I want to run my code in NAV 2018

I know this is a small solution and it is never as easy as it is here, but anyway.

Modify the imageName in CreateDevEnv.ps1 to

$imageName = "microsoft/dynamics-nav:2018"

and run the script.

It might take some time if you haven't pulled the NAV 2018 image yet, but once the image is downloaded, the time should be the same as with NAV 2017.
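If you prefer, you can pre-pull the image from a console before running the script, so the script itself runs at full speed:

```powershell
docker pull microsoft/dynamics-nav:2018
```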

Now run GetChanges.ps1 to see that a few other things were changed by moving the solution to NAV 2018.

What if I want to move my solution to AL and VS Code

Now, you have got the hang of it, you are spinning up containers and living on the edge, but you want more... - you want to move your solution to AL.

In order to move the solution to AL, we need to import the changes to NAV 2018 or later and convert the modified objects to AL.

Modify the imageName in CreateDevEnv.ps1 to

$imageName = "microsoft/dynamics-nav:2018"

and run the script.

You should see some info about Dev. Server in the output, which you should note down.

Container Hostname : MyFirstApp
Container Dns Name : MyFirstApp
Web Client : http://MyFirstApp/NAV/
Dev. Server : http://MyFirstApp
Dev. ServerInstance : NAV

Files:
http://MyFirstApp:8080/al-0.12.17720.vsix

Initialization took 45 seconds

You should also download the .vsix file from the container to your host and install it in Visual Studio Code.

After this, run this script in the same instance of ISE (so that the $containerName variable is still set):

Convert-ModifiedObjectsToAl -containerName $containerName -startId 50100 -openFolder

and you should get a folder with your AL files.

Now you can start VS Code, make sure the .vsix extension is the right version, press Ctrl+Shift+P and select AL Go!

Select local server and modify launch json to use the Dev. Server and Dev. Server Instance described in the container output.

Also set the authentication to Windows, copy the AL files to the folder, and you are on your way to doing Extensions v2 development...

Enjoy

Freddy Kristiansen
Technical Evangelist


NAV on Docker 0.0.5.5 or…– What’s new


As some users of NAV on Docker have noticed, the images get rebuilt from time to time. We typically rebuild all images when we have changes to the generic layer which might be of value to users of NAV on Docker. This blog post describes what's new since the last blog post on 0.0.4.1 (December 2nd 2017).

A lot of small improvements have been made, but for most users the changes aren't really visible if you "just" use NAV containers for development or test.

0.0.5.5 - 2018.03.29

Remove tenant data from the App database. For performance reasons, we left the tenant part in the App database when switching to multitenancy. This, however, made Export-NavContainerDatabasesAsBacpac create wrong bacpac files.

Test Assemblies, used for the performance test framework are included in images built on generic 0.0.5.5 or newer. They are located in C:\Test Assemblies inside the container.

Task Scheduler is now enabled by default for developer preview and Business Central Sandboxes.

TestToolkit and Test objects for localized developer preview and localized Business Central sandbox containers are located in C:\TestToolKit.

0.0.5.4 - 2018.03.24

In order to stay with the new naming strategy of the microsoft dotnet images, NAV containers are now using microsoft/dotnet-framework:4.7.1-windowsservercore-ltsc2016 as base image.

TenantEnvironmentType is set to Sandbox for Business Central Sandboxes and new tenants are mounted as sandbox tenants.

Container age is now checked on restart of the container, meaning that a container whose age exceeds 90 days will be unable to start once stopped.

0.0.5.3 - 2018.02.26

Generate symbols and setup NAV for dual development between AL and C/AL.

Restore bakfile to a separate folder to avoid conflicts with the database in the container.

0.0.5.2 - 2018.02.23

Support for Azure SQL in the ARM template (http://aka.ms/getnavext), which allows you to deploy bacpacs to Azure SQL as part of your Azure Resource Manager Template.

Support for TLS 1.2, which has become a requirement for github and other download places.

Bugfix: ClickOnce failure without AcsUri defined.

0.0.5.1 - 2018.02.17

AAD support for Windows Client using ClickOnce

0.0.5.0 - 2018.02.15

Support for Azure Active Directory (AAD) authentication

0.0.4.5 - 2018.02.03

Fail fast if the amount of memory assigned to the NAV container isn't at least 3Gb.

Support for multitenancy.

0.0.4.4 - 2017.12.19

Support for sharing URLs as folders to containers to allow for script overriding in Azure Container Services and more.

Bugfix: Report preview not working due to missing t2embed.dll

0.0.4.3 - 2017.12.09

Include upgradetoolkit and extensions from the DVD on the container.

0.0.4.2 - 2017.12.04

Support for specifying custom config settings for Service Tier, Web Client and Windows Client as a parameter for Docker. Example: --env CustomNavSettings=EnableTaskScheduler=true

Split the install scripts to version specific folders to simplify source code.


 

Alongside all of these changes, the navcontainerhelper PowerShell module has also been updated together with the NAV ARM Templates to give a better experience when deploying test, development and demo environments.

Enjoy

Freddy Kristiansen
Technical Evangelist

Who are you following?


Update April 12th - added a few to my list - have more reading to do :-)

Dynamics 365 Business Central and Dynamics NAV are part of a vibrant community with a lot of very active people. Tweets, blog posts, webinars, books and GitHub projects could easily fill up my day just trying to follow everybody who is working on our product. This blog post lists some of the people/sites you can follow if you, like me, are passionate about Dynamics 365 Business Central and Dynamics NAV.

NAV Team Blog

If you only follow one blog (beside my blog :-)), it should IMO be the NAV Team blog.

https://blogs.msdn.microsoft.com/nav/

@mpdynamics365

If you only follow one person on twitter (beside @freddydk :-)), it should IMO be Marko @mpdynamics365.

People i follow on twitter - and their blogs...

I follow a lot of people on Twitter, and the following list is not exhaustive, but these people also have blogs where the main goal seems to be spreading their knowledge about our common favorite product. No political BS, no nonsense, just pure love for our product, and as a wise man once said, the only thing that doubles when it is shared is knowledge.

In alphabetic order:

Please let me know if I should be following your blog as well?

Other important Microsoft people to follow

A few extra Microsoft people who I get value of following:

The full list of people I follow on twitter

The complete list of the people I follow can be found here:

https://twitter.com/freddydk/following

Please let me know if I should be following you?

Conferences, forums and webinars

Besides twitter and blogs, we have the conferences, the forums and the webinars. Here are the ones that I follow:

Please let me know if I forgot any?

Facebook, LinkedIn, Instagram, Pinterest,...

Personally, I use Facebook for personal stuff only. I typically do not share work related stuff on Facebook, but I do have a number of friends from the community as friends on Facebook. I haven't really started to use LinkedIn, Instagram, Pinterest or the various chat apps for anything, but it is something I will be investigating.

 

Enjoy

Freddy Kristiansen
Technical Evangelist

Enabling Premium Experience in Business Central Sandbox Containers


A few people have discovered that manufacturing, production and other functionality which is only available on the Premium plan is not available when running a Business Central Sandbox container.

The reason for this is that access is controlled by the User Plan table, and by default the admin user is assigned the Essential plan. In Business Central, the plan is determined by what you purchase, and you will not be able to add or modify records in the User Plan table.

Business Central Sandbox containers are for development and test, and of course we need to be able to develop and test against Premium - but it is also important to be able to run with Essential.

This blog post will describe how to assign the premium plan to your default super user in the NavContainer. It will also describe how you can create a number of test users and assign user groups and permissions to these users, so that you can test your app using the different users.

Username             User Groups              Permission Sets
EXTERNALACCOUNTANT   D365 EXT. ACCOUNTANT     D365 BUS FULL ACCESS
                     D365 EXTENSION MGT       D365 EXTENSION MGT
                                              D365 READ
                                              LOCAL
PREMIUM              D365 BUS PREMIUM         D365 BUS PREMIUM
                     D365 EXTENSION MGT       D365 EXTENSION MGT
                                              LOCAL
ESSENTIAL            D365 BUS FULL ACCESS     D365 BUS FULL ACCESS
                     D365 EXTENSION MGT       D365 EXTENSION MGT
                                              LOCAL
INTERNALADMIN        D365 INTERNAL ADMIN      D365 READ
                                              LOCAL
                                              SECURITY
TEAMMEMBER           D365 TEAM MEMBER         D365 READ
                                              D365 TEAM MEMBER
                                              LOCAL
DELEGATEDADMIN       D365 EXTENSION MGT       D365 BASIC
                     D365 FULL ACCESS         D365 EXTENSION MGT
                     D365 RAPIDSTART          D365 FULL ACCESS
                                              D365 RAPIDSTART
                                              LOCAL

and... - I will describe how to do this, whether you use Azure VMs, navcontainerhelper or docker run.

Azure VMs

If you use http://aka.ms/bcsandbox to create your Business Central Sandbox Container Azure VM, you will find two new options in the Azure Resource Manager template, which by default are set to yes.

The first option is whether or not your admin user should be assigned a premium plan. The second is whether or not you want the setup to include the test users described above - that's it - by default you get premium plan and test users, as of today.

NavContainerHelper

If you are using New-NavContainer to create your Business Central Sandbox Container, you should upgrade to version 0.2.8.3.

Now you will have a new switch called assignPremiumPlan on New-NavContainer, use it like this:

New-NavContainer -accept_eula -assignPremiumPlan -containerName test -imageName microsoft/bcsandbox

Adding this option will assign the premium plan to your default admin user. Internally this just adds a record to the User Plan table.

In order to create the test users, you will have to call a function called Setup-NavContainerTestUsers:

Setup-NavContainerTestUsers -containerName test -tenant default -password $securePassword

and specify the container and the password you want to use for the new users.

Internally, Setup-NavContainerTestUsers downloads a codeunit with ID=50000, imports it and runs an external function called CreateTestUsers with the password needed. After this you can delete or overwrite the codeunit; it is not needed anymore. The implementation might change.

If you want to see the codeunit, or if you need to modify the codeunit for your needs, you can download it at http://aka.ms/createtestusersfob.

Docker run

When you are using docker run to run your containers, you have a little more work to do.

First of all, you need to override the SetupNavUsers.ps1 by sharing a local folder to c:\run\my in the container and place a file called SetupNavUsers.ps1 in that folder with this content:

# Invoke default behavior
. (Join-Path $runPath $MyInvocation.MyCommand.Name)
 
Get-NavServerUser -serverInstance NAV -tenant default |? LicenseType -eq "FullUser" | % {
    $UserId = $_.UserSecurityId
    Write-Host "Assign Premium plan for $($_.Username)"
    sqlcmd -S 'localhost\SQLEXPRESS' -d $DatabaseName -Q "INSERT INTO [dbo].[User Plan] ([Plan ID],[User Security ID]) VALUES ('{8e9002c0-a1d8-4465-b952-817d2948e6e2}','$userId')" | Out-Null
}

This will assign the premium plan to the admin user in the database.

In order to setup test users, you should download the codeunit from http://aka.ms/createtestusersfob import it using the classic development environment and run the CreateTestUsers function in the codeunit with the password you want to set for the users.

 

Enjoy

Freddy Kristiansen
Technical Evangelist

What Docker Image is right for you?


The last year has been quite a journey for people using Docker images for Microsoft Dynamics NAV or Dynamics 365 Business Central. Images have been available in various places and private registries; navdocker, developer preview, navinsider and microsoft/dynamics-nav are just some of the terms you have run into.

This blog post should demystify things and explain clearly which Docker image is right for you in a given situation - and where to get it.

Developing for Microsoft Dynamics NAV

If you are developing for Microsoft Dynamics NAV, you will find Docker images on the public Docker Hub under microsoft/dynamics-nav.

In the public Docker hub, you will find all cumulative updates to NAV 2016, 2017 and 2018 in all country versions and you can use the images simply by specifying the right tag. The tagging strategy used in microsoft/dynamics-nav is:

microsoft/dynamics-nav:version-cu-country

where

  • version is 2016, 2017 or 2018 (default is 2018)
  • cu is rtm, cu1, cu2, ... (default is latest)
  • country is w1, dk, de, nl, na, ... (default is w1)

Image name examples:

microsoft/dynamics-nav
microsoft/dynamics-nav:2018-cu3-de
microsoft/dynamics-nav:2017
microsoft/dynamics-nav:2016-dk
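Pulling and running one of these images from a console follows the standard Docker pattern; the accept_eula parameter and the 3Gb memory requirement are the same ones used in the support section of this post:

```powershell
docker pull microsoft/dynamics-nav:2018-cu3-de
docker run -e accept_eula=Y -m 3G microsoft/dynamics-nav:2018-cu3-de
```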

You will also be able to run earlier versions of Microsoft Dynamics NAV using the generic image as explained here.

Developing for Dynamics 365 Business Central

If you are developing for the current version of Dynamics 365 Business Central, you will find Docker images on the public Docker Hub under microsoft/bcsandbox.

You will find the current version of Dynamics 365 Business Central using

microsoft/bcsandbox:build-country

where

  • build is the build number (default is current version)
  • country is w1, dk, us, ca, de, ... (default is w1)

Image name examples:

microsoft/bcsandbox
microsoft/bcsandbox:us
microsoft/bcsandbox:dk
microsoft/bcsandbox:12.0.21229.0-us

You should normally never use the image with a specific build number unless instructed to do so. This is primarily used when you spin up Container Sandbox images from within Dynamics 365 Business Central (page search for Sandbox).

Maintaining an app in AppSource for Dynamics 365 Business Central

If you have published an app in AppSource, you should continuously test that the app works with the next version of Dynamics 365 Business Central. You will be able to get insider builds of Business Central from a private registry called bcinsider.azurecr.io, and the credentials for this private registry are available through Microsoft Collaborate.

The insider builds are normally updated daily and the Dynamics 365 Business Central servers are updated monthly to this version. When the Dynamics 365 Business Central servers are updated, the image will also be deployed on the public docker hub under microsoft/bcsandbox (see previous section).

The image name follows the same tagging strategy as the public Dynamics 365 Business Central images:

bcinsider.azurecr.io/bcsandbox:build-country

where

  • build is the build number (default is latest version)
  • country is w1, dk, us, ca, de, ... (default is w1)

Image name examples:

bcinsider.azurecr.io/bcsandbox
bcinsider.azurecr.io/bcsandbox:us
bcinsider.azurecr.io/bcsandbox:dk
bcinsider.azurecr.io/bcsandbox:12.1.21581.0-nl
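Since bcinsider.azurecr.io is a private registry, you need to log in with the Microsoft Collaborate credentials before pulling; the user name and password below are placeholders:

```powershell
docker login bcinsider.azurecr.io -u <username> -p <password>
docker pull bcinsider.azurecr.io/bcsandbox:dk
```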

You should never use the image with a specific build number unless instructed to do so. Set up Continuous Integration and Continuous Deployment by pulling the daily update of the Dynamics 365 Business Central Sandbox container image.

Developing for a future release of Dynamics 365 Business Central

Much like the strategy for Windows and other Microsoft services, Dynamics 365 Business Central will receive major updates semi-annually. If you are developing an app for AppSource targeting the next major update, or if you need cutting-edge functionality directly from the lab, you will be able to get insider builds of Business Central from a private registry called bcinsider.azurecr.io, and the credentials for this private registry are available through Microsoft Collaborate.

The insider builds are normally updated daily, and the Dynamics 365 Business Central servers are updated semi-annually to this version. When the Dynamics 365 Business Central servers are updated, the image will also be deployed on the public Docker Hub under microsoft/bcsandbox and will receive updates monthly (see previous section).

Please be aware that these insider builds might be less stable than builds from the previous sections. You might see new functionality being developed over multiple days, and upgrade procedures between versions might not work smoothly. Please only use builds from this branch if you have a reason to do so.

The image name follows the same tagging strategy as the public Dynamics 365 Business Central images, but with a different namespace:

bcinsider.azurecr.io/bcsandbox-master:build-country

where

  • build is the build number (default is latest version)
  • country is w1, dk, us, ca, de, ... (default is w1)

Image name examples:

bcinsider.azurecr.io/bcsandbox-master
bcinsider.azurecr.io/bcsandbox-master:us
bcinsider.azurecr.io/bcsandbox-master:dk
bcinsider.azurecr.io/bcsandbox-master:12.1.21581.0-nl

You should never use the image with a specific build number unless instructed to do so.

Support?

If you encounter issues with Microsoft Dynamics NAV or with the current release of Dynamics 365 Business Central, you must report them through the Dynamics Support team. You can open a Support Request to CSS through the PartnerSource portal (https://mbs2.microsoft.com/Support/SupportRequestStep1.aspx) or contact your Service Account Manager (SAM) in the local subsidiary to understand what is included in your contract in terms of support incidents and PAH (Partner Advisory Hours). Your SAM can also walk you step by step through opening a support request, or through getting credentials if this is the first time you or your company are engaging Support.

If you encounter issues which are specific to the insider builds of Dynamics 365 Business Central, you should report these on Github AL issues.

In the near future, there will be a blog post on the NAV team blog explaining the above in more detail.

If you have issues running the simplest NAV on Docker container (docker run -e accept_eula=Y -m 3G microsoft/dynamics-nav), you should troubleshoot your infrastructure. A lot of frequently encountered issues can be solved by reading this blog post. You can also download a Container Host Debug PowerShell script here: http://aka.ms/debug-containerhost.ps1 to troubleshoot issues with the container host.

If you have issues running NAV on Docker or Business Central Sandbox Containers, which you think might be related to problems in the Container images, please report these on Github nav-docker issues.

If you have issues running NAV on Docker or Business Central Sandbox Containers using navcontainerhelper, which you think might be related to problems in navcontainerhelper, please report these on Github navcontainerhelper issues.

If you have issues running NAV on Docker or Business Central Sandbox Containers in Azure VMs using the ARM templates (http://aka.ms/getnav, http://aka.ms/bcsandbox, etc.), which you think might be related to problems in the ARM templates, please report these on Github nav-arm-templates issues.

 

Enjoy

Freddy Kristiansen
Technical Evangelist

A “new” learning opportunity – the Hackathon at NAVUG Focus 18


I know, a Hackathon is not something new. Hackathons have existed for at least half a decade, but how can a Hackathon be a learning opportunity?

Typically, we see a Hackathon as an event where people get together to create a prototype or proof of concept of an idea, but the Wikipedia description of a Hackathon is actually as simple as:

“A hackathon, a hacker neologism, is an event when programmers meet to do collaborative computer programming.”

The idea

A few months ago, Mark (@GatorRhodie) contacted me and told me about NAVUG FOCUS 18. He told me that one of the themes of this year's NAVUG FOCUS is "A Brave New World" - about AL development, Docker, VS Code, Azure etc.

He told me that he wanted to conduct a Hackathon during the event. We had a few calls, discussed various approaches and ended up agreeing that the best approach would be to create some ideas/challenges that people can work on in groups if they don't have ideas of their own. The challenges should be things of common usage - things that people can go back and look at as a reference on how to do things.

We brainstormed some ideas, and with great help from Jesper (@JesperSchulz) we ended up with a set of challenges which we think are appropriate for the event.

The event

The event takes place on Monday evening, May 21st, from 5:30PM to 11PM (not sure how my jetlag is going to cope with that:-)) and the idea is that people can choose one or more of "our" challenges to work on - or they can work on ideas of their own.

For every challenge there is a description, an expected result, some steps, some hints and some cheat sheets. We will have some people in the room to help out if people get stuck, but the primary idea is that people help each other. People working on "our" challenges can request a cheat sheet if they cannot figure out how to solve a specific issue.

Depending on the outcome of this event, we might use the same mechanism at other conferences. I am also considering whether our challenges can be made public somehow so that people can conduct their own Hackathon events for social learning/programming.

A sample challenge

Below, you will find one of the challenges in its full form (but without the cheat sheets). This challenge is a level 1 challenge.

Auto-fill company information on the customer card

When a new customer is entered in Dynamics 365 Business Central, the user can decide to enter a domain name instead of the name. The system then looks up the company associated with this domain name from a Web Service and fills out the remaining fields on the customer card with the information obtained from the Web Service.

To complete this challenge, you will need:

  • A Dynamics 365 Business Central Sandbox Environment
  • Visual Studio Code with the AL Extension installed
    • Azure VMs will have VS Code pre-installed
  • An API Key from http://www.fullcontact.com

Expected result:

Steps:

  • Create an empty app
  • Create a page extension for the customer card
  • On the OnAfterValidate trigger on the Name field, check whether the entered value is a domain name
  • Ask the user whether he wants to look up information about the company associated with this domain name
  • Call the fullcontact Web API and assign field values

Hints:

  • In VS Code, use Ctrl+Shift+P and type AL GO and remove the customerlist page extension
  • Use the tpageext snippet
  • Use EndsWith to check whether the name is a domain name
  • Use the Confirm method to ask whether the user wants to download info
  • Use HttpClient to communicate with the Web Service
  • Use Json types (JsonObject, JsonToken, JsonArray and JsonValue) to extract values from the Web Service result

Cheat Sheets:

  • Create an empty app
  • Create a page extension
  • Code for communicating with Web Service
  • Update the customer

 

See you in Indianapolis.

 

Enjoy

Freddy Kristiansen
Technical Evangelist

The hackathon at UG FOCUS 18


As many of you know, we conducted a Hackathon at the UG FOCUS in Indianapolis this week.

“A hackathon, a hacker neologism, is an event when programmers meet to do collaborative computer programming.”

It all started a little over a month ago, when Mark Rhodes from the User Group called me and asked if I wanted to help conduct a hackathon. We had a few calls and decided to use this as a learning environment, where we put people who had never used VS Code and Business Central in a room with some ideas/challenges they could try out (you can find the challenges here: https://blogs.msdn.microsoft.com/hackathonchallenges/)

The Hackathon

During the hackathon, some people were sitting in groups of 3 or 4 and some were working alone (we didn't force anything). It was amazing to see how the groups challenged each other NOT to go for the cheat sheets and used blogs, docs and brainstorming to find a solution. Exactly what I was hoping for. They would also call for help and ask questions before asking for the password for a cheat sheet. The people working alone got stuck faster - and when they couldn't find a way out, were faster to request a cheat sheet password.

The Results

On the second day (the very last session), people could show their progress, and @DaveHatker stood up - demoing and showing that they had completed 3½ of the 5 challenges. Not only could he demo what they had done - he could explain how it was done and why it worked. He was talking about table extensions, page extensions, snippets, user controls, control add-ins etc. as if he was an experienced trainer, and in reality he had been exposed to this for less than 24 hours. It was also clear when looking at their code that they had NOT seen the cheat sheets. In fact, one of their solutions was smarter than mine - I learned something new just there…😊

I asked David to share a few words of his own about the Hackathon: “Dynamics 365 Business Central can do just about anything, it seems.  So when I heard that Mark Rhodes and Freddy Kristiansen had put together a hackathon, specifically, the hackaNAVathon at Focus 2018, I was excited to see what was in store for us!  For the very first one of these ever done, I thought the hackathon was put together very thoughtfully and even better, every participant was provided a virtual machine with everything we needed to accomplish our challenges.   The challenges were fun, practical, and built upon themselves to allow us to get familiar with all of the ins and outs of Visual Studio Code and the AL language.  The hackathon is definitely something I’d recommend to both the professional and novice NAV developer.  It’s a great way to see how you can accomplish almost anything with Dynamics 365 Business Central!”

What went really well

Mark and the NAVUG group went above and beyond to make the Hackathon a success: badges, T-shirts, pizza, a room, and pre-registered enthusiastic people, ready to hack...

Groups of 3-4 people worked really well and would tend to get much further before going for a cheat sheet, even though the passwords for the cheat sheets were available on the screen. Different ideas on how to proceed come up: searching documentation, blogs, samples etc. These teams were able to complete challenges without any assistance, although this was the first time they had ever coded in VS Code.

The Pizza - very good Chicago-style pizza. There is something about pizza and a Hackathon.

The Beer - the Scottish-style ale from the Sun King brewery was really my kind of beer, and although beer and a Hackathon don't necessarily go together, it was very nice after a long day of conferencing.

The prepared challenges (see https://blogs.msdn.microsoft.com/hackathonchallenges) with steps and hints were crucial for giving some direction. What should people search for? It is much easier to figure out how to do things if you know what to look for.

What we learned

You shouldn't travel from Europe and arrive in the US Sunday evening and do a Hackathon Monday evening until 11pm:-)

You probably shouldn't have a Hackathon in the evening after a full day of conference. Days are long and full of learning.

People working alone on the challenges tend to reach for the cheat sheet much faster, and even though this might get them to a result much faster, chances are that they don't always know why it is working. Groups of 3-4 people seem to learn better.

This is definitely not the last time we are running a hackathon, and I recommend that partners and customers try this out; it is a fun, cheap and scalable way to learn.

The passwords for the hackathon cheat sheets are available at https://blogs.msdn.microsoft.com/hackathonchallenges (press Ctrl+A to reveal), but in order to get the most out of this - do run the hackathon in groups and learn from our learnings.

We will create more challenges and you are also more than welcome to contribute with new challenges for the community - don't be shy.

 

Happy coding

Freddy Kristiansen
Technical Evangelist

AAD authentication, Edit In Excel, Embedded PowerBi and http://aka.ms/GETNAV


I do not have a count of how many times somebody has asked me, e-mailed me, or sent messages asking for the "old" NAVDEMODEPLOY with NAV 2018 or Business Central Sandbox Containers.

What people are alluding to is really this blog post - and I am happy to announce that as of today you can now do most of the steps on that list.

The only things missing are the SharePoint Portal and the Demo Apps. The idea is to create the SharePoint portal functionality as an App - so stay tuned...:-)

AAD authentication

For quite some time, it has been possible to spin up Azure VMs with AAD authentication, just by specifying your Office 365 admin credentials in the http://aka.ms/getnav template in these fields:

Behind the scenes, the ARM template would invoke a function in the navcontainerhelper called Create-AadAppsForNav. This was actually the same function used in navdemodeploy to set up AAD, but recently people started to have problems with it. The function was built using the tools that were available back then (some AzureRM PowerShell module and the Microsoft Graph), which might not have been built for this. But... - it has served us well for years; may that code rest in peace.

Now we have the AzureAD PowerShell module, which is built for this purpose, and creating an AAD App for Web Client single sign-on requires only one line of code:

$ssoAdApp = New-AzureADApplication -DisplayName "NAV WebClient for $appIdUri" `
                                   -Homepage $publicWebBaseUrl `
                                   -IdentifierUris $appIdUri `
                                   -ReplyUrls $publicWebBaseUrl

Wow.

Note: For Business Central an additional replyUrl with SignIn added is required.

Create-AadAppsForNav also created an App for Edit In Excel and for embedded Power BI, and that has become similarly easy. The code is available here.

So, all in all - the new method is more stable, faster and the code is easier to read.

Having the AzureAD PowerShell module also made it easy to create a function called Create-AadUsersInNavContainer, so of course that was added - and of course a field was added to the http://aka.ms/getnav template:

Edit In Excel

As mentioned above, the AAD App for Edit In Excel is also created by Create-AadAppsForNav, and as of today, the http://aka.ms/getnav ARM template will also configure Edit In Excel in the NAV Container or the Business Central Sandbox Container. This is done by setting ExcelAddInAzureActiveDirectoryClientId in CustomSettings.config to the AdAppId field of the object returned from New-AzureADApplication when creating the Excel AAD App.

Set-NAVServerConfiguration -ServerInstance nav -KeyName "ExcelAddInAzureActiveDirectoryClientId" -KeyValue "$ExcelAdAppId"

Having done this, the Open In Excel menu item in NAV or Business Central automagically changes to Edit In Excel:

PowerBI dashboards

With navdemodeploy, we could add another service tier providing insecure web services access and create dashboards through it. With http://aka.ms/getnav, we can use Let's Encrypt and get a free 3-month trusted certificate, which works with Power BI. As of today, there is also a function in the navcontainerhelper to renew the certificate and get another 3 months - more about that in a separate blog post.

This means that for Power BI dashboards, you just enter the OData Web Services URL and you will have the nicest dashboards:

No extra steps needed, nice:-)

Embedded PowerBI

For embedded PowerBI, we have to configure the app in NAV / Business Central. This is done by adding the AppID and the Key to table 6300 (Azure AD App Setup). There are several ways to do this in PowerShell, but I decided to reuse the mechanism from navdemodeploy: download a .fob file with a codeunit and invoke the codeunit from PowerShell.

In the navcontainerhelper, we have functions to help with this:

$fobfile = Join-Path $env:TEMP "AzureAdAppSetup.fob"
Download-File -sourceUrl "http://aka.ms/azureadappsetupfob" -destinationFile $fobfile
Import-ObjectsToNavContainer -containerName $containerName -objectsFile $fobfile -sqlCredential $sqlCredential
Invoke-NavContainerCodeunit -containerName $containerName -tenant "default" -CodeunitId 50000 -MethodName SetupAzureAdApp -Argument ($AdProperties.PowerBiAdAppId+','+$AdProperties.PowerBiAdAppKeyValue)

And of course this is automagically done when using http://aka.ms/getnav.

With that, you can now click the "Get Started with Power BI" link and you will get this:

Click the authorize and you should see:

Accept and you should see:

Now, you can click the drop down, select a report (if you have one), enable it and voila - embedded PowerBI.

NAV Containers or Business Central Sandbox Containers

Note that everything in this blog post works with NAV Containers and with Business Central Sandbox Containers. When using http://aka.ms/getnav, you just specify microsoft/bcsandbox:<country> in the NAV Docker Image field and your VM will become a Business Central Sandbox environment with AAD authentication, Edit in Excel and everything.

You can also use http://aka.ms/bcsandboxext, which defaults to bcsandbox containers and has a few other settings - the end result is the same: an Azure VM with everything:-)

 

Enjoy

Freddy Kristiansen


NavContainerHelper 0.3.1.0 and a new Docker Generic build 0.0.6.6


Over the weekend, a new version of the NavContainerHelper (version 0.3.1.0) was uploaded to the PowerShell Gallery with a few bug fixes and a few extra functions. Also, a new version of the generic image (microsoft/dynamics-nav:generic-0.0.6.6) has been published, and all NAV images (2016, 2017 and 2018) are being rebuilt.

Business Central Sandbox images are automatically based on the newest generic image. Older images are not rebuilt.

Hosting endpoints on different ports

Up until now, the NAV Containers have always had fixed endpoints, with the option of mapping these endpoints to different public ports on the host. A number of people found this hard to set up, and we have found that the Client Services Port (the port which the Windows Client connects to) cannot be used with port mapping.

So, we decided to make it easier:-)

All docker images with generic build 0.0.6.5 or newer have some new parameters (environment variables) which allow you to change the endpoint port on the container. With this, you do not need to do any port mappings; instead you can just specify the port on which the container is listening.

The environment variables are:

WebClientPort defines the port on which the Web Client will be listening. Default is 80 or 443 (depending on whether or not you are using SSL).

ClientServicesPort defines the port on which the Service Tier is hosting Client Services (Windows Client connection port). Default is 7046.

SoapServicesPort defines the port on which the Service Tier is hosting Soap Services. Default is 7047.

ODataServicesPort defines the port on which the Service Tier is hosting OData Services and API Services. Default is 7048.

DeveloperServicesPort defines the port on which the Service Tier is hosting Developer Services (VS Code connection port). Default is 7049.

So, when using Docker Run, you will have to use:

--env ClientServicesPort=7146

to make the container listen for Client Services on another port.

If you are using New-NavContainer in NavContainerHelper, you can use

-ClientServicesPort 7146

You do not need to add this to the additionalParameters array.

The "old" way of using port mapping still works like before.
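Putting the two approaches side by side as a sketch (the image name and ports are just examples from earlier in this post):

```shell
# New way (generic build 0.0.6.5 or newer): change the endpoint itself, so
# the container listens for Client Services directly on 7146.
new_way="docker run -e accept_eula=Y -m 3G --env ClientServicesPort=7146 microsoft/dynamics-nav"

# Old way: keep the default 7046 inside the container and map it on the host.
# Note: as described above, port mapping does not work for Client Services,
# so for that particular endpoint the new way is the one to use.
old_way="docker run -e accept_eula=Y -m 3G -p 7146:7046 microsoft/dynamics-nav"

echo "$new_way"
echo "$old_way"
```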

New Functions

Add-FontsToNavContainer will copy one or more fonts from the container host to a container. A number of fonts (including some Chinese and Korean fonts) are not included in WindowsServerCore and as such are missing when printing reports which include characters in these languages. This function will copy the fonts from the container host and register them in the container.

Generate-SymbolsInNavContainer shouldn't really be needed, but it will re-generate all symbols in the container.

Time Zone

A strange bug (which must be in Docker) resulted in an error when trying to invoke a codeunit inside a container. The reason was that the container seemed to be started with a time zone which didn't exist. Somehow the name and id of the time zone were translated into the local language, but when requesting time zones in the container, the time zones were not translated. The original bug is here.

New generic image

The new generic image comes with a few updates, but the most significant is really the updated WindowsServerCore dependency. The new image comes with an updated WindowsServerCore, and all images since NAV 2016 RTM are being rebuilt with it. This might be the last time we automatically rebuild all images. In the future, we might just build new images on new versions of the generic layer.

Note that you can still run the old versions if you specify the accept_outdated flag.

Windows Server 2019 Insider Preview

Windows Server 2019 is available as Insider builds. Read more here.

All NAV images are still based on ltsc2016 (Windows Server Long-Term Servicing Channel 2016), as this is the image which is compatible with most operating systems. We will likely adopt the next LTSC release when it is out and build our images for both versions, which will then also work with Windows 10.

The generic image is available for 1709 (microsoft/dynamics-nav:generic-0.0.6.6-1709) and 1803 (microsoft/dynamics-nav:generic-0.0.6.6-1803), but the individual images will not be built for these versions.

 

Enjoy

Freddy Kristiansen
Technical Evangelist

About time…


OK, OK - I should have started blogging a long time ago, I just never really got around to it until now!


In the beginning, my blog will probably be focused on NAV 2009 tips and tricks - but who knows where it might go.
Microsoft Dynamics NAV 2009 is set to be released to market in Q4 2008 - and through my work with a number of partners on NAV 2009, I do have a number of tips and tricks which I think new partners could use.


I will also try to monitor sites like mibuso and navug and do my best to help the community get up to speed on NAV 2009 as quickly as possible.


Freddy Kristiansen


PM Architect


Microsoft Dynamics NAV


 

Multiple Service Tiers


NOTE – there is an updated post reg. Multiple Service Tiers in NAV 2009 SP1 here.

If you haven't done so, please read the post about the Service Tier before reading this:-)

A very typical scenario with both partners and customers is to have more than one database. This can be because you have a development database and a production database - or it could be the partner having a copy of all customer databases locally for troubleshooting.

You could of course install the Service Tier locally on all computers and then change CustomSettings.config to point to a new database every time you need to log on, but that doesn't really sound like something we want people to do.

The setup we would like to have is:

  • One or more SQL Server boxes with a bunch of databases on
  • One or more Service Tier boxes with at least one Service Tier per database
  • The Client installed locally on all machines being able to connect to these Service Tiers.

This post will go into detail about how to accomplish this.

The Simple Story!

To make a long story short - adding a Service Tier isn't any harder than copying the Service directory to another directory (maybe called test) and then registering a new service based on the executable in that folder using the SC command:

SC CREATE testServiceTier binpath= "C:\Program Files\Microsoft Dynamics NAV\60\test\Microsoft.Dynamics.Nav.Server.exe" start= auto obj= "NT Authority\NetworkService"

This would actually work if you change the CustomSettings.config to use a different port than 7046 - so why write a big post about it?

And why it isn't that simple after all!

You typically want to use the same port for all your Service Tiers - allowing you to distinguish them by instance name - and since the default Service Tier doesn't have a dependency on NetTcpPortSharing, you cannot just start adding new ones.

You want a consistent naming algorithm for your Service Tiers, and you want to make sure, that if you create a new Service Tier, it doesn't inherit settings from one of the other Service Tiers by coincidence.

And last but not least, you often want to create a Web Service listener to sit next to your Service Tier (that is of course if you intend to use Web Services).

So - I created a bunch of .BAT files which would do the job for me. Feel free to look at the .BAT files, copy them, use them, modify them, but I do encourage you to send any improvements of the .BAT files to me, so that I can make them available to the community.

Note that I am NOT a .BAT file expert - but I did learn a LOT by creating these .BAT files.

The first 3 .BAT files I created are called:

CreateService.bat, DeleteService.bat and RecreateOriginalService.bat

I think the names speaks for themselves. These .BAT files then have dependencies on another .BAT file, a .VBS script and a new CustomSettings.template - all of these will be included in this post (I hope you are not in a hurry)

The .BAT files need to be placed in the NAV installation directory (which typically would be C:\Program Files\Microsoft Dynamics NAV\60\) - and the very first thing you want to do is run RecreateOriginalService.bat.

RecreateOriginalService.bat

@ECHO OFF
IF NOT "%1" == "" GOTO usage
SET NAVPATH=%~dp0
IF EXIST "%NAVPATH%service\Microsoft.Dynamics.Nav.Server.exe" GOTO NavPathOK
ECHO.
ECHO Unable to locate installation service directory
ECHO.
ECHO %NAVPATH%service\
ECHO.
ECHO Maybe you already ran recreateoriginalservice.bat
goto :eof
:NavPathOK
IF NOT EXIST "%NAVPATH%service.org\Microsoft.Dynamics.Nav.Server.exe" GOTO orgok
ECHO.
ECHO Directory already exists
ECHO.
ECHO %NAVPATH%service.org\
ECHO.
ECHO Maybe you already ran recreateoriginalservice.bat
GOTO :eof
:orgok
C:
CD "%NAVPATH%"
SC stop MicrosoftDynamicsNavWS
CALL SLEEP.BAT 3
SC stop MicrosoftDynamicsNavServer
CALL SLEEP.BAT 3
SC delete MicrosoftDynamicsNavWS
SC delete MicrosoftDynamicsNavServer
RENAME Service Service.org
CALL createservice DynamicsNAV dummy dummy auto
COPY /Y customsettings.template service.org\customsettings.config
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO recreateoriginalservice.bat
ECHO.

A couple of comments on the "source":

  • %~dp0 returns the directory in which the .BAT file is placed (with trailing backslash C:\Program Files\Microsoft Dynamics NAV\60\)
  • SLEEP.BAT is a small .BAT file which sleeps in a number of seconds (approx.)
  • SC stop <service> - tries to stop a Service (the Service is set in STOP_PENDING mode)
  • SC delete <service> - tries to delete a Service (note that if the Service isn't stopped it will put the Service into a DELETE_PENDING mode which then sometimes requires a server reboot).
  • CustomSettings.template is the original CustomSettings.Config with a few modifications.
  • CreateService.bat is called with two dummy parameters - look below for further explanation

The way this .BAT file works is that it removes the default installed Service Tier, renames the Service directory to Service.org and adds the original Service Tier again (using the CustomSettings.config that was already in the Service directory). After this, it copies in a new CustomSettings.config template with specific fields that can later be auto-replaced by CreateService.bat. CreateService will create a Service in a directory named after the instance name - so after running RecreateOriginalService.bat you will find 2 new directories in the NAV path: DynamicsNAV and Service.org, instead of the original Service directory.

RecreateOriginalService.bat checks whether it has run already - so please do not create a Service Tier called Service - you can probably guess why by looking at the .bat file. Also, you cannot create services called Classic, Database, RoleTailored Client or OutlookAddin - but who wants to do that anyway.

Sleep.bat

As you saw, RecreateOriginalService.bat uses a .BAT file called SLEEP.BAT. The main purpose of this .BAT file is to wait for a number of seconds - and there really isn't any command line tool that works in all versions of Windows which can do this - so I made this one:

@ping 127.0.0.1 -n 2 -w 1000 > nul
@ping 127.0.0.1 -n %1% -w 1000 > nul

Works fine - but is kind of strange to look at (that is why it got its own .BAT file).
Vista, Windows Server 2003 and 2008 have a command called TIMEOUT, but that doesn't work on XP.

CustomSettings.template

The CustomSettings.template is a copy of the original CustomSettings.config with 3 changes:

<add key="DatabaseServer" value="#DBSERVER#"></add>
<add key="DatabaseName" value="#DATABASE#"></add>
<add key="ServerInstance" value="#INSTANCE#"></add>

replacing the original DatabaseServer, DatabaseName and ServerInstance values with three "variables".

When CreateService.bat is called, it will replace these variables with values given on the command line; this way you can create a Service Tier without having to edit the config file afterwards (very useful when doing testing on multiple Service Tiers).
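The .BAT files use a VB Script for the actual replacement; as a rough stand-in for illustration, the same substitution could be done with sed (the file name and content here are just a one-line sample of the template):

```shell
# Hypothetical equivalent of replacestringinfile.vbs: replace a template
# variable in a config file, writing through a temp file.
replace_string_in_file() {
  search="$1"; replace="$2"; file="$3"
  sed "s/$search/$replace/g" "$file" > "$file.tmp" && mv "$file.tmp" "$file"
}

# Demo against a one-line stand-in for CustomSettings.template:
printf '<add key="ServerInstance" value="#INSTANCE#"></add>\n' > cs.config
replace_string_in_file '#INSTANCE#' 'test' cs.config
cat cs.config   # value is now "test"
```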

CreateService.bat

Now this is the fun stuff...

@ECHO OFF
IF "%1" == "" GOTO usage
SET SERVICE=%1
SET DBSERVER=%2
SET DATABASE=%3
SET START=%4
SET WHICH=%5
IF "%START%" == "" SET START=demand
IF "%START%" == "auto" goto startok
IF "%START%" == "demand" goto startok
IF "%START%" == "disabled" goto startok
ECHO.
ECHO Illegal value for 4th parameter
GOTO usage
:startok
IF "%WHICH%" == "" SET WHICH=both
IF "%WHICH%" == "both" goto whichok
IF "%WHICH%" == "servicetier" goto whichok
IF "%WHICH%" == "ws" goto whichok
ECHO.
ECHO Illegal value for 5th parameter
GOTO usage
:whichok
SET type=own
IF "%WHICH%" == "both" SET type=share
SET NAVPATH=%~dp0
IF EXIST "%NAVPATH%service.org\Microsoft.Dynamics.Nav.Server.exe" GOTO NavPathOK
ECHO.
ECHO Unable to locate original Service directory
ECHO.
ECHO in %NAVPATH%service.org\
ECHO.
ECHO Maybe you need to run recreateoriginalservice.bat
goto :eof
:NavPathOk
IF EXIST "%NAVPATH%%SERVICE%\Microsoft.Dynamics.Nav.Server.exe" GOTO serviceexists
C:
CD "%NAVPATH%"
MKDIR "%SERVICE%"
IF ERRORLEVEL 1 GOTO nodir
XCOPY service.org %SERVICE% /s/e
SET SERVICEDIR=%NAVPATH%%SERVICE%
replacestringinfile.vbs #INSTANCE# %SERVICE% "%SERVICEDIR%\customsettings.config"
IF '%DBSERVER%' == '' GOTO editconfig
replacestringinfile.vbs #DBSERVER# %DBSERVER% "%SERVICEDIR%\customsettings.config"
IF '%DATABASE%' == '' GOTO editconfig
replacestringinfile.vbs #DATABASE# %DATABASE% "%SERVICEDIR%\customsettings.config"
GOTO configdone
:editconfig
NOTEPAD %SERVICEDIR%\customsettings.config
:configdone
SC CONFIG NetTcpPortSharing start= demand
SET DEP=
if "%WHICH%" == "ws" goto onlyws
SC CREATE MicrosoftDynamicsNavServer$%SERVICE% binpath= "%SERVICEDIR%\Microsoft.Dynamics.Nav.Server.exe $%SERVICE%" DisplayName= "NAV Server %SERVICE%" type= %type% start= %START% obj= "NT Authority\NetworkService" depend= NetTcpPortSharing
SET DEP=/MicrosoftDynamicsNavServer$%SERVICE%
if "%WHICH%" == "servicetier" goto notws
:onlyws
SC CREATE MicrosoftDynamicsNavWS$%SERVICE% binpath= "%SERVICEDIR%\Microsoft.Dynamics.Nav.Server.exe $%SERVICE%" DisplayName= "NAV Server %SERVICE% WS" type= %type% start= %START% obj= "NT Authority\NetworkService" depend= HTTP/NetTcpPortSharing%DEP%
:notws
IF "%START%" == "demand" GOTO :eof
IF "%START%" == "disabled" GOTO :eof
if "%WHICH%" == "ws" goto startws
SC START MicrosoftDynamicsNavServer$%SERVICE%
if "%WHICH%" == "servicetier" goto :eof
:startws
SC START MicrosoftDynamicsNavWS$%SERVICE%
goto :eof
:serviceexists
ECHO.
ECHO Service already exists
ECHO.
GOTO :eof
:nodir
ECHO.
ECHO Could not create service directory
ECHO.
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO CreateService servicetiername [databaseserver] ["databasename"] [demand^|auto^|disabled] [both^|servicetier^|ws]
ECHO.
ECHO.

As you can see in the usage section, you can start the .BAT file with 5 parameters - but only the first is mandatory.

The first parameter is the Service Tier name, and this becomes the name of the directory which holds the executable and the configuration file for this Service Tier (therefore, that one cannot be defaulted).

You might think that the database server and database name would be mandatory too, but if you look at the .BAT file, it actually starts Notepad and asks you to complete the config file if you don't specify these parameters. (This is the reason RecreateOriginalService.bat calls CreateService with "DynamicsNAV dummy dummy" - we don't want Notepad, and since the original config file is still there, no replacements are made.)

CreateService.bat can create both Service Tiers and Web Service listeners, and the default is to create both (sharing one process). It can set the services to start automatically, start on demand, or be disabled by default.

The services created by CreateService.bat are called MicrosoftDynamicsNavServer$<instancename> and MicrosoftDynamicsNavWS$<instancename>, and the service descriptions are NAV Server <instancename> and NAV Server <instancename> WS, to make sure they are listed next to each other in the Services list.

The Web Service listener is created with a dependency on the Service Tier (if you create both), so that restarting the Service Tier automatically restarts the Web Service listener as well - and both services are created with a dependency on NetTcpPortSharing.

I use the ReplaceStringInFile VBScript (listed later in this post) to replace the template variables in the config file with the values specified on the command line.

I guess the best way of describing the functionality of CreateService.bat is to give a bunch of examples that you can validate against the source above.

C:\Pro....60\>CreateService.bat test

Creates a Service Tier and a Web Service listener with the instance name test and opens Notepad to allow you to specify databaseserver and database name. Both Services are set to start manually and they share one process.

C:\Pro....60\>CreateService.bat test localhost "Demo Database NAV (6-0)"

Creates a Service Tier and a Web Service listener with the instance name test, pointing to the demo database on localhost. Both Services are set to start manually and they share one process.

C:\Pro....60\>CreateService.bat test localhost "Demo Database NAV (6-0)" auto servicetier

Creates a Service Tier with the instance name test, pointing to the demo database on localhost. The Service Tier has its own process and starts automatically.

C:\Pro....60\>CreateService.bat test localhost "Demo Database NAV (6-0)" demand ws

Creates a Web Service listener with the instance name test, pointing to the demo database on localhost. The Web Service listener has its own process and is set to start manually.

C:\Pro....60\>CreateService.bat test mydbserver "Demo Database NAV (6-0)" auto servicetier
C:\Pro....60\>CreateService.bat test mydbserver "Demo Database NAV (6-0)" auto ws

Creates a Service Tier and a Web Service listener with the instance name test, pointing to the demo database on mydbserver. Both Services are set to start automatically and they each have their own process.

for /L %p in (1,1,50) DO ( createservice.bat test%p localhost "Demo Database NAV (6-0)" )

Creates 50 Service Tiers and Web Service listeners pointing to the demo database on localhost. All pairs share a process and all are set to demand load. Yes, I know that I am probably the only one in this world who would do something like this - but I just wanted to see how many Service Tiers I could install.

The result of that investigation is "unlimited" - I didn't run into any barrier (other than memory and disk space, of course) when installing a huge number of Service Tiers that are set to start manually.

The picture is totally different if I set the Service Tiers to auto start - approximately 50 started Service Tiers managed to eat all my available memory, and my machine refused to start more services.

DeleteService.bat

DeleteService really isn't that complicated. Most of the work is making sure that it actually is a Service Tier before stopping the services, deleting them and removing the service's directory structure without asking for permission. It should be safe, though...

@ECHO OFF
IF "%1" == "" GOTO usage
SET NAVPATH=%~dp0
IF EXIST "%NAVPATH%service.org\Microsoft.Dynamics.Nav.Server.exe" GOTO NavPathOK
ECHO.
ECHO Unable to locate original Service directory
ECHO.
ECHO in %NAVPATH%service.org\
ECHO.
ECHO Maybe you need to run recreateoriginalservice.bat
goto :eof
:NavPathOk
C:
CD "%NAVPATH%"
IF EXIST "%1\Microsoft.Dynamics.Nav.Server.exe" GOTO serviceexists
ECHO.
ECHO Service doesn't exist
GOTO usage
:serviceexists
SC query MicrosoftDynamicsNavServer$%1 | FINDSTR "STOPPED"
IF NOT ERRORLEVEL 1 GOTO dontstop
SC stop MicrosoftDynamicsNavWS$%1
CALL SLEEP.BAT 3
SC stop MicrosoftDynamicsNavServer$%1
CALL SLEEP.BAT 3
:dontstop
SC delete MicrosoftDynamicsNavWS$%1
SC delete MicrosoftDynamicsNavServer$%1
rd %1 /S /Q
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO DeleteService servicename
ECHO.

A couple of comments to the source:

  • rd %1 /S /Q - removes a directory structure without asking for permission
  • SC query MicrosoftDynamicsNavServer$%1 | FINDSTR "STOPPED" will check whether the Service Tier is stopped
  • DeleteService does absolutely nothing to the database - it only unhooks a service and removes the directory in which it was installed.

BTW - if you tried to create the 50 Service Tiers with CreateService - you can delete them using:

for /L %p in (1,1,50) DO ( deleteservice.bat test%p )

ReplaceStringInFile.vbs

As you saw, CreateService needs to replace a "variable" in a file with a value - like #DBSERVER# -> localhost - and there is no built-in command line tool to do that. Fortunately we have the Internet, and I found a nice VBScript on http://www.motobit.com/tips/detpg_replfile/ which does exactly what I want:

Dim FileName, Find, ReplaceWith, FileContents, dFileContents
Find         = WScript.Arguments(0)
ReplaceWith  = WScript.Arguments(1)
FileName     = WScript.Arguments(2)

'Read source text file
FileContents = GetFile(FileName)

'replace all string In the source file
dFileContents = replace(FileContents, Find, ReplaceWith, 1, -1, 1)

'Compare source And result
if dFileContents <> FileContents Then
  'write result If different
  WriteFile FileName, dFileContents

  Wscript.Echo "Replace done."
  If Len(ReplaceWith) <> Len(Find) Then 'Can we count n of replacements?
    Wscript.Echo _
    ( (Len(dFileContents) - Len(FileContents)) / (Len(ReplaceWith)-Len(Find)) ) & _
    " replacements."
  End If
Else
  Wscript.Echo "Searched string Not In the source file"
End If

'Read text file
function GetFile(FileName)
  If FileName<>"" Then
    Dim FS, FileStream
    Set FS = CreateObject("Scripting.FileSystemObject")
      on error resume Next
      Set FileStream = FS.OpenTextFile(FileName)
      GetFile = FileStream.ReadAll
  End If
End Function

'Write string As a text file.
function WriteFile(FileName, Contents)
  Dim OutStream, FS

  on error resume Next
  Set FS = CreateObject("Scripting.FileSystemObject")
    Set OutStream = FS.OpenTextFile(FileName, 2, True)
    OutStream.Write Contents
End Function

Please go to the Web Site and rate the article if you use the script - I gave it 5 stars:-)
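On machines without Windows Script Host, the same replace-and-report logic can be sketched in a few lines of Python. This is my own illustration, not part of the original tooling - the function name is made up, and the case-insensitive matching mirrors the VBScript's textual-compare mode:

```python
import re

def replace_in_file(find, replace_with, filename):
    """Replace all occurrences of `find` in `filename`, case-insensitively,
    like the VBScript's Replace(..., 1, -1, 1); returns the number of hits."""
    with open(filename, encoding="utf-8") as f:
        contents = f.read()
    # A callable replacement avoids re.sub interpreting backslashes
    # in values such as Windows paths.
    new_contents, count = re.subn(re.escape(find), lambda _: replace_with,
                                  contents, flags=re.IGNORECASE)
    if count:
        # Only rewrite the file when something actually changed,
        # just like the VBScript compares source and result.
        with open(filename, "w", encoding="utf-8") as f:
            f.write(new_contents)
        print(f"Replace done. {count} replacements.")
    else:
        print("Searched string not in the source file")
    return count
```

Called as replace_in_file("#DBSERVER#", "localhost", path), it rewrites the config file only when the template variable was actually found.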

That's all good - but how do I populate the combo box with Service Tiers in the Role Tailored Client?

When you try to select a new service tier in the Role Tailored Client, you choose Select Server in the Microsoft Dynamics NAV menu:

image

This pops up the Select Server and Company dialog:

image 

But you will notice that the drop-down with server names is empty - it only gets populated as you successfully connect to Service Tiers.

If you want to populate this list, you need to alter the client configuration file, which is stored locally on each client. The path of the file is:

C:\Documents and Settings\<username>\Local Settings\Application Data\Microsoft\Microsoft Dynamics NAV\ on my Windows XP and my Windows 2003 Server box and

C:\Users\<username>\AppData\Local\Microsoft\Microsoft Dynamics NAV\ on my Vista box.

The config file is called ClientUserSettings.config, and the key you want to alter is called UrlHistory. If you modify the key to be:

<add key="UrlHistory" value="localhost/DynamicsNAV,localhost/test" />

You will now have these two selections in the dropdown.

That's it for now

Having these .BAT files in place will allow you to manage your Service Tiers more easily.

Please remember that these .BAT files are listed here only as examples, and there is absolutely no guarantee that they will work for the purpose you want them to. That is, however, the nice thing about .BAT files - they can be modified using Notepad.

Oh yes - and the prize for reading this far is that you get to download the .BAT files from a ZIP file here: http://www.freddy.dk/MultipleServiceTiers.zip

I am also working on a couple of .BAT files to be placed on the Client, allowing remote starting of Service Tiers along with startup of the Role Tailored Client (now that we are creating services with manual start) - more on this in a later post.

I also think it would be beneficial for people to get info about the challenges of installing a Database Server and a Service Tier in a 3T environment, although I do think this is described in the documentation for NAV.

Enjoy

Freddy Kristiansen

PM Architect

Microsoft Dynamics NAV

The Service Tier


My first technical blog post is going to describe some details about the Service Tier that some people might find interesting (and some people might think is common knowledge:-))

What is the Service Tier?

Very briefly - the Service Tier is the middle tier in a Microsoft Dynamics NAV 2009 installation. This is where all database access is performed and all business logic is executed - meaning this is also where the application runs. The Database Tier needs to be SQL Server 2005 or higher, and the Client Tier needs to be the Role Tailored Client. When installed, the Service Tier does nothing but wait for a connection from a Role Tailored Client, so even if the Service Tier is started, it doesn't really consume a lot of resources until clients connect to it.

The Service Tier can also be configured as a Web Service listener, making part of your application accessible from any Web Service consumer. It is not recommended to expose Web Services directly on the Internet, for security reasons - but you could easily create a high-level proxy for some Web Services and expose that to the Internet.

The Role Tailored Client doesn't connect to the SQL Server directly at all - and the SQL Server could in fact be hidden behind a firewall and be inaccessible from the clients.

When a client connects to the Service Tier, it is authenticated using Windows Authentication, and the Service Tier impersonates the user when connecting to the Database Tier. The Service Tier of course needs permission to do that - we cannot have random computers on the network running around impersonating users. More on this topic later.

What is installed?

When you install Microsoft Dynamics NAV 2009 (The Demo install), the installer will create 2 services for you:

Microsoft Dynamics NAV Business Web Services
Service Name: MicrosoftDynamicsNavWs
This is the Web Service listener. By default it is set to start manually, and without starting this service you will not be able to do anything with Microsoft Dynamics NAV 2009 Web Services.

Microsoft Dynamics NAV Server
Service Name: MicrosoftDynamicsNavServer
This is the Service Tier. By default this service is set to start Automatically.

Both services are set to run as the NETWORK SERVICE user, and by default the Web Service listener has a dependency on the HTTP protocol.

This is what you can see when looking at properties for the service in the services list (Control Panel -> Administrative Tools -> Services).

Another interesting thing pops up if you query the Service Controller for info about the services:

image

Notice that the services are running with a flag called WIN32_SHARE_PROCESS - meaning that even though you start both services, your task list will only reveal one process running:

image

Why is this important?

First of all, the two services share metadata caches, memory and everything else - meaning that you save some memory by running them in the same process.

Secondly, if you at some point want to restart your Service Tier in order to flush these caches, you actually need to restart both services to achieve that goal.

Configuration of the Service Tier

The default installation path of Microsoft Dynamics NAV 2009 varies according to version and language of the operating system - but on my machine it is under:

C:\Program Files\Microsoft Dynamics NAV\60\

Note that if you install on a 64-bit computer, it will be installed under C:\Program Files (x86), because the Service Tier in NAV 2009 is 32-bit only.

In this directory you will find a folder called Service (if you installed the demo) and in this directory a configuration file called:

CustomSettings.config

Among the keys in the config file you will find the following 5 keys:

DatabaseServer / DatabaseName
These values are used when the Service Tier connects to the Database Tier - this is the location of your SQL Server and the name of the Database you want to connect to.

After changing these values, you need to restart the Service Tier (both services if started) in order to make the Service Tier connect to a new database.

It may seem cumbersome if you are a single user working with multiple databases - but one Service Tier has one database connection, and that is just the way it is.

If you have multiple databases, you will want to install multiple Service Tiers - and I will create a post on how to do this. You can also install multiple Service Tiers connecting to the same database, and you can do that on the same computer if you want.

ServerInstance / ServerPort / WebServicePort
These are the values that differentiate multiple Service Tiers on one box.
The default ServerInstance installed by the installer is DynamicsNAV, the default ServerPort is 7046 and the default WebServicePort is 7047.

Again - for one Service Tier, these values are fine.

On the Client computer you will connect to a Service Tier using the Select Server dialog, in which you specify the URL of the Service Tier as:

localhost/DynamicsNAV

or

localhost:7046/DynamicsNAV

As you can see, the server URL contains the computer name of the Service Tier machine, the port and the Service Tier instance - and if you have multiple Service Tiers, at least one of these has to differ.

The Web Service listener also uses the instance name in the URL used to connect to Web Services:

http://localhost:7047/DynamicsNAV/WS/<companyname>/Services

More about this in a later post on web services.

BTW, Microsoft recommends that you use the instance name to differentiate between Service Tiers, not the ports.
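As a sketch - assuming the template keys from the CreateService.bat config file above - a second instance named test could then keep the default ports and differ only by instance name, since both services are created with a dependency on NetTcpPortSharing:

```xml
<!-- CustomSettings.config fragment for a second Service Tier (example values) -->
<add key="ServerInstance" value="test" />
<add key="ServerPort" value="7046" />
<add key="WebServicePort" value="7047" />
```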

Is the Service Tier interpreting C/AL code?

As you probably already know, the answer to this question is no. In NAV 2009, the majority of the Service Tier is written in C# and runs managed code - and the application is also converted into C# and executed at runtime as compiled managed code.

The way this happens is that whenever you compile an object in C/SIDE, the object is compiled behind the scenes to C# (by the Classic Client), and the C# code is stored in the Object Metadata table (in the BLOB field called User Code). Furthermore, the Object Timestamp field in the Object Tracking table is updated, allowing the Service Tier to pick up these changes. When the Service Tier needs to run code from an object, the C# code for the object is written to disk and - through some magic - compiled into a module which is loaded dynamically, allowing the Service Tier to replace single codeunits, pages etc. on the fly.

The directory in which the Service Tier stores the C# files can be found in:

<Program Data>\Microsoft\Microsoft Dynamics NAV\60\Server\<process ID>\source

where <Program Data> is

C:\Documents and Settings\All Users\Application Data\ on my XP and my Windows 2003 Server and
C:\Users\All Users\ on my Vista box

and the process ID can be found in the Task Manager by choosing to show the PID column.

Having the C# files available in this directory actually allows you to debug this code as described in Claus Lundstrøm's post:

http://blogs.msdn.com/clausl/archive/2008/10/14/debugging-in-nav-2009.aspx

but note that:

  • You will be debugging the C# code - and you cannot see the AL code
  • You will have to install SP1 on VS2005 or VS2008 in order to debug
  • You are debugging the Service Tier meaning that you debug every connection (can be pretty confusing:-))

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

4 BAT files for the Client Tier


If you haven't done so, please read the post about Multiple Service Tiers before reading this:-)

In my previous post (Multiple Service Tiers) I promised to talk about what to do on the Client Tier when you have installed a number of services on a Service Tier computer that are all set to start manually.

Do you really have to go to the Service Tier machine in order to start the services or is there a simpler way?

As you might have guessed - there is (else you wouldn't be reading this post)

The SC command can actually start services on a different box with the syntax:

SC \\machine start servicename

Assuming that we used CreateService.bat to create our Service Tiers, we know what the service names are based on the instance name, and we should be able to create 4 .BAT files which enable us to:

1. Start Service Tiers
2. Stop Service Tiers
3. Restart Service Tiers
4. Start the Role Tailored Client using a specific Service Tier (and start the Service Tier if necessary)

These four scripts really make things easier when dealing with multiple Service Tiers - here they are...

StartService.bat

@ECHO OFF
IF "%1" == "" GOTO usage
SETLOCAL
SET SERVICETIER=%2
IF NOT "%SERVICETIER%" == "" SET SERVICETIER=\\%SERVICETIER%
SET NAVPATH=%~dp0
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 > nul
IF ERRORLEVEL 1 GOTO :eof
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 | FINDSTR "RUNNING"
IF NOT ERRORLEVEL 1 GOTO :eof
SC %SERVICETIER% start MicrosoftDynamicsNavServer$%1
CALL "%NAVPATH%SLEEP.BAT" 3
SC %SERVICETIER% start MicrosoftDynamicsNavWS$%1
CALL "%NAVPATH%SLEEP.BAT" 3
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO startservice instancename [servicetier]
ECHO.

SETLOCAL means that the changes we make to environment variables are not reflected outside this .BAT file.
The .BAT file checks whether the service exists and whether it has already been started - if so, there is no real reason to start it.
Note BTW that you will need the Sleep.bat file described in the Multiple Service Tiers post here as well.
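Sleep.bat itself is not listed in this post; a common minimal version (a sketch - the original from the other post may differ) abuses ping as a timer, since ping replies arrive roughly once per second:

```bat
@ECHO OFF
REM Sleep.bat - wait approximately %1 seconds by pinging localhost %1 times
ping -n %1 127.0.0.1 > nul
```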

StopService.bat

@ECHO OFF
IF "%1" == "" GOTO usage
SETLOCAL
SET SERVICETIER=%2
IF NOT "%SERVICETIER%" == "" SET SERVICETIER=\\%SERVICETIER%
SET NAVPATH=%~dp0
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 > nul
IF ERRORLEVEL 1 GOTO :eof
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 | FINDSTR "STOPPED"
IF NOT ERRORLEVEL 1 GOTO :eof
SC %SERVICETIER% stop MicrosoftDynamicsNavWS$%1
CALL "%NAVPATH%SLEEP.BAT" 3
SC %SERVICETIER% stop MicrosoftDynamicsNavServer$%1
CALL "%NAVPATH%SLEEP.BAT" 3
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO stopservice instancename [servicetier]
ECHO.

Kind of the same story as StartService.bat - if the Service is stopped, there is no real reason to try and stop it.

RestartService.bat

You have probably guessed by now what this .BAT file does - so there is no reason to explain.

@ECHO OFF
IF "%1" == "" GOTO usage
SETLOCAL
SET SERVICETIER=%2
IF NOT "%SERVICETIER%" == "" SET SERVICETIER=\\%SERVICETIER%
SET NAVPATH=%~dp0
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 > nul
IF ERRORLEVEL 1 GOTO :eof
SC %SERVICETIER% query MicrosoftDynamicsNavServer$%1 | FINDSTR "STOPPED"
IF NOT ERRORLEVEL 1 GOTO dontstop
SC %SERVICETIER% stop MicrosoftDynamicsNavWS$%1
CALL "%NAVPATH%SLEEP.BAT" 3
SC %SERVICETIER% stop MicrosoftDynamicsNavServer$%1
CALL "%NAVPATH%SLEEP.BAT" 3
:dontstop
SC %SERVICETIER% start MicrosoftDynamicsNavServer$%1
CALL "%NAVPATH%SLEEP.BAT" 3
SC %SERVICETIER% start MicrosoftDynamicsNavWS$%1
CALL "%NAVPATH%SLEEP.BAT" 3
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO restartservice instancename [servicetier]
ECHO.

The .BAT file more or less just does a StopService followed by a StartService.

RTC.bat

Now this is the fun stuff - this .BAT file starts the Role Tailored Client connecting to a specific Service Tier - but first it starts the Service Tier in question if it wasn't already started.

@ECHO OFF
IF "%1" == "" GOTO usage
SETLOCAL
SET COMPANY=%3
REM IF '%COMPANY%' == '' SET COMPANY="CRONUS International Ltd."
IF '%COMPANY%' == '' SET COMPANY="CRONUS USA, Inc."
ECHO.%COMPANY%
SET COMPANY=%COMPANY:"=%
SET COMPANY=%COMPANY:,=#%
:again
SET BEFORE=%COMPANY%
FOR /F "tokens=1* delims= " %%A IN ('ECHO.%COMPANY%') DO (
IF NOT "%%B" == "" SET COMPANY=%%A%%20%%B
)
IF NOT "%BEFORE%" == "%COMPANY%" GOTO again
SET COMPANY=%COMPANY:#=,%
SET MACHINE=%2
SET SERVICETIER=%MACHINE%
IF "%SERVICETIER%" == "" SET SERVICETIER=localhost
CALL STARTSERVICE.BAT %1 %SERVICETIER%
START dynamicsnav://%SERVICETIER%/%1/%COMPANY%/
GOTO :eof
:usage
ECHO.
ECHO Usage:
ECHO.
ECHO RTC instancename [servicetier] ["Company"]
ECHO.

As you can see in the usage section, RTC can be started specifying the Service Tier instance name, the Service Tier machine and the Company name.

Launching a Role Tailored Client pointing to the default installed Service Tier would be:

C:\Prog...\60\>RTC DynamicsNAV localhost "CRONUS International Ltd."

or just

C:\Prog...\60\>RTC DynamicsNAV

because localhost and the CRONUS demo company are the default values for the 2nd and 3rd parameters. No real reason for those defaults, except for the fact that they have been working for me.

Note that the FOR loop in the .BAT file replaces spaces in the COMPANY variable with %20. Another thing to notice is that I replace commas with # before the loop (since commas act as delimiters in the FOR loop) and replace them back afterwards - meaning that you cannot have # in your company names. You also cannot have " in your company name.

If you have other special characters, or if you do have # in your company names, you will have to change the name mangling of the COMPANY value.
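The space/comma mangling above is really just URL-encoding of the company name. As an illustration (this is not part of the original .BAT files - the function and its defaults are my own), Python's urllib shows what the final dynamicsnav:// URL has to look like for any company name:

```python
from urllib.parse import quote

def rtc_url(instance, servicetier="localhost", company="CRONUS International Ltd."):
    # quote() percent-encodes spaces, commas and other reserved characters;
    # the batch file only handles spaces, commas and quotes by hand.
    return f"dynamicsnav://{servicetier}/{instance}/{quote(company)}/"

print(rtc_url("DynamicsNAV"))
# dynamicsnav://localhost/DynamicsNAV/CRONUS%20International%20Ltd./
```

This handles # and other special characters that the batch mangling cannot.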

I use this .BAT file for a number of shortcuts on my desktop to launch Role Tailored Clients against specific Service Tiers, and on my Service Tier box I run a command every night shutting down all my Service Tiers:

for /f %D in ('dir /ad/b') do ( CALL STOPSERVICE.BAT %D )

The .BAT files can of course also be used on the Service Tier (except for RTC.bat) for stopping and starting services - but they are a big help on the Client Tier, the way I am running things at least.

Once again, those who have read all the way to the end can download a ZIP file containing the .BAT files from http://www.freddy.dk/4ForTheClientTier.zip - this ZIP file also contains a .BAT file called StopServices.bat (which only runs on the Service Tier - it stops all running NAV Service Tiers).

I promise this is my last post with .BAT files:-)

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV
