docs/advanced-analytics/java/extension-java.md (2 additions & 2 deletions)
@@ -4,7 +4,7 @@ description: Run Java code on SQL Server 2019 using the Java language extension.
 ms.prod: sql
 ms.technology: machine-learning
-ms.date: 11/29/2018
+ms.date: 12/07/2018
 ms.topic: conceptual
 author: HeidiSteen
 ms.author: heidist
@@ -42,7 +42,7 @@ On Windows, we recommend installing the JDK under the default /Program Files/ fo
 
 ## Install on Linux
 
-You can install the [database engine and the Java extension together](../../linux/sql-server-linux-setup-machine-learning.md#chained-installation), or add Java support to an existing instance. The following examples add the Java extension to an existing installation.
+You can install the [database engine and the Java extension together](../../linux/sql-server-linux-setup-machine-learning.md#install-all), or add Java support to an existing instance. The following examples add the Java extension to an existing installation.
@@ -18,7 +18,7 @@
 When using the [Java language extension](extension-java.md), the [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) system stored procedure is the interface used to call the Java runtime. Permissions on the database apply to Java code execution.
 
 This article explains implementation details for Java classes and methods that execute on SQL Server. Once you are familiar with these details, review the [Java sample](java-first-sample.md) as your next step.
 
-## Basic principles
+### Basic principles
 
 * Compiled custom Java classes must exist in .class files or .jar files in your Java classpath. The [CLASSPATH parameter](#set-classpath) provides the path to the compiled Java files.
@@ -29,15 +29,15 @@ This article explains implementation details for Java classes and methods that e
 * "params" is used to pass parameters to a Java class. Calling a method that requires arguments is not supported, which makes parameters the only way to pass argument values to your method.
 
 > [!Note]
-> This note restates supported and unsupported operations specific to Java in CTP 2.0.
+> This note restates supported and unsupported operations specific to Java in CTP 2.x.
 > * On the stored procedure, input parameters are supported. Output parameters are not.
 > * Streaming using the sp_execute_external_script parameter **@r_rowsPerRead** is not supported.
 > * Partitioning using **@input_data_1_partition_by_columns** is not supported.
 > * Parallel processing using **@parallel=1** is supported.
 
-## Call sp_execute_external_script
+### Call sp_execute_external_script
 
-The [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) system stored procedure is the interface used to call the Java runtime. The following example shows an sp_execute_external_script using the Java extension, and parameters for specifying path, script, and your custom code.
+Applicable to both Windows and Linux, the [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) system stored procedure is the interface used to call the Java runtime. The following example shows an sp_execute_external_script using the Java extension, and parameters for specifying path, script, and your custom code.
 
 ```sql
 DECLARE @myClassPath nvarchar(30)
@@ -57,7 +57,7 @@ EXEC sp_execute_external_script
 
 <a name="set-classpath"></a>
 
-## Set CLASSPATH
+### Set CLASSPATH
 
 Once you have compiled your Java class or classes and placed the .class file(s) or .jar files in your Java classpath, you have two options for providing the classpath to the SQL Server Java extension:
@@ -72,27 +72,27 @@ One approach for specifying a path to compiled code is by setting CLASSPATH as a
 
 Just as you created a system variable for the JDK executables, you can create a system variable for code paths. To do this, create a system environment variable called "CLASSPATH".
 
-## Class requirements
+### Class requirements
 
 In order for SQL Server to communicate with the Java runtime, you need to implement specific static variables in your class. SQL Server can then execute a method in the Java class and exchange data using the Java language extension.
 
 > [!Note]
 > Expect the implementation details to change in upcoming CTPs as we work to improve the experience for developers.
 
-## Method requirements
+### Method requirements
 To pass arguments, use the **@param** parameter in sp_execute_external_script. The method itself cannot have any arguments. The return type must be void.
 
 ```java
 public static void test() {}
 ```
 
-## Data inputs
+### Data inputs
 
 This section explains how to push data to Java from a SQL Server query using **InputDataSet** in sp_execute_external_script.
 
 For every input column your SQL query pushes to Java, you need to declare an array.
 
-### inputDataCol
+#### inputDataCol
 
 In the current version of the Java extension, the **inputDataColN** variable is required, where *N* is the column number. These array variables will be populated with input data from a SQL Server query before execution of the Java program you are calling.
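To make the inputDataColN contract above concrete, here is a minimal sketch. It is illustrative only: the class name and column types are assumptions, not from the PR, and in practice the extension (not user code) fills these arrays before execution.

```java
// Hypothetical class for a query that pushes two columns: an int ID and a
// string value. Arrays are numbered from 1 to match the query's column order;
// the Java extension populates them before the user method runs, so the
// initial sizes here are only placeholders.
public class TwoColumnInput {
    public static int[] inputDataCol1 = new int[1];
    public static String[] inputDataCol2 = new String[1];
}
```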
-### inputNullMap
+#### inputNullMap
 
 Null map is used by the extension to know which values are null. This variable will be populated with information about null values by SQL Server before execution of the user function.
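The null-map convention can be sketched as follows. This is a hedged illustration: only `inputDataCol1` and `inputNullMap` come from the documented contract; the helper method, class name, and placeholder sizes are invented. Row 0 of the map corresponds to the first input column, and `true` marks a null cell.

```java
public class NullAware {
    // Filled by the extension before execution (sizes here are placeholders).
    public static String[] inputDataCol1 = new String[1];
    public static boolean[][] inputNullMap = new boolean[1][1];

    // Illustrative helper: count the cells of column 1 that the null map
    // reports as non-null.
    public static int countNonNull() {
        int n = 0;
        for (int i = 0; i < inputDataCol1.length; i++) {
            if (!inputNullMap[0][i]) {
                n++;
            }
        }
        return n;
    }
}
```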
@@ -116,30 +116,30 @@ The user only needs to initialize this variable (and the size of the array needs
 public static boolean[][] inputNullMap = new boolean[1][1];
 ```
 
-## Data outputs
+### Data outputs
 
 This section describes **OutputDataSet**, the output data sets returned from Java, which you can send to and persist in SQL Server.
 
 > [!Note]
 > Output parameters in [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) are not supported in this version.
 
-### outputDataColN
+#### outputDataColN
 
 Similar to **inputDataSet**, for every output column your Java program sends back to SQL Server, you must declare an array variable. All **outputDataCol** arrays should have the same length. You need to make sure this is initialized by the time the class execution finishes.
 
 ```java
 public static <type>[] outputDataColN = new <type>[]
 ```
 
-### numberofOutputCols
+#### numberofOutputCols
 
 Set this variable to the number of output data columns you expect to have when the user function finishes execution.
 
 ```java
 public static short numberofOutputCols = <expected number of output columns>;
 ```
 
-### outputNullMap
+#### outputNullMap
 
 Null map is used by the extension to indicate which values are null. We require this since primitive types don't support null. Currently, we also require the null map for String types, even though Strings can be null. Null values are indicated by "true".
 
@@ -148,6 +148,8 @@ This NullMap must be populated with the expected number of columns and rows you
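Putting the input, output, and null-map requirements of this file together, a complete class under the stated contract might look like the sketch below. The class name, the echo logic, and the empty-string placeholder for null cells are assumptions for illustration, not part of the PR.

```java
public class Echo {
    // Inputs: populated by the extension from the SQL query (sizes are placeholders).
    public static String[] inputDataCol1 = new String[1];
    public static boolean[][] inputNullMap = new boolean[1][1];

    // Outputs: user code must initialize these before the method returns.
    public static String[] outputDataCol1;
    public static short numberofOutputCols = 1;
    public static boolean[][] outputNullMap;

    // Entry point: no arguments, void return, as the requirements state.
    public static void test() {
        int rows = inputDataCol1.length;
        outputDataCol1 = new String[rows];
        outputNullMap = new boolean[1][rows];
        for (int i = 0; i < rows; i++) {
            boolean isNull = inputNullMap[0][i];
            outputNullMap[0][i] = isNull;              // "true" marks a null cell
            outputDataCol1[i] = isNull ? "" : inputDataCol1[i];
        }
    }
}
```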
docs/advanced-analytics/what-s-new-in-sql-server-machine-learning-services.md (5 additions & 4 deletions)
@@ -4,7 +4,7 @@ description: New feature announcements for each release of SQL Server 2016 R Ser
 ms.prod: sql
 ms.technology: machine-learning
-ms.date: 11/06/2018
+ms.date: 12/07/2018
 ms.topic: conceptual
 author: HeidiSteen
 ms.author: heidist
@@ -23,9 +23,10 @@ This release adds the top-requested features for R and Python machine learning o
 
 | Release | Feature update |
 |---------|----------------|
-| CTP 2.0 | Linux platform support for R and Python machine learning, plus the new Java extension. For help getting started, see [Install SQL Server Machine Learning Services on Linux](../linux/sql-server-linux-setup-machine-learning.md). |
-| CTP 2.0 | The [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) includes two new parameters that enable you to easily generate multiple models from partitioned data. Learn more in this tutorial, [Create partition-based models in R](tutorials/r-tutorial-create-models-per-partition.md). |
-| CTP 2.0 | Failover cluster support is now supported on Windows and Linux, assuming SQL Server Launchpad service is started on all nodes. For more information, see [SQL Server failover cluster installation](../sql-server/failover-clusters/install/sql-server-failover-cluster-installation.md). |
+| CTP 2.0 | Linux platform support for R and Python machine learning. Get started with [Install SQL Server Machine Learning Services on Linux](../linux/sql-server-linux-setup-machine-learning.md). |
+|| The [Java language extension](java/extension-java.md) on both Windows and Linux is new in SQL Server 2019 preview. You can make compiled Java code available to SQL Server by assigning permissions and setting the path. Client apps with access to SQL Server can use data and run your code by calling [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql), the same procedure used for R and Python integration on SQL Server. |
+|| The [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) procedure introduces two new parameters that enable you to easily generate multiple models from partitioned data. Learn more in this tutorial, [Create partition-based models in R](tutorials/r-tutorial-create-models-per-partition.md). |
+|| Failover clusters are now supported on Windows and Linux, assuming the SQL Server Launchpad service is started on all nodes. For more information, see [SQL Server failover cluster installation](../sql-server/failover-clusters/install/sql-server-failover-cluster-installation.md). |
docs/big-data-cluster/big-data-cluster-create-apps.md (8 additions & 7 deletions)
@@ -5,7 +5,7 @@ description: Deploy a Python or R script as an application on SQL Server 2019 bi
 author: TheBharath
 ms.author: bharaths
 manager: craigg
-ms.date: 12/06/2018
+ms.date: 12/07/2018
 ms.topic: conceptual
 ms.prod: sql
 ms.custom: seodec18
@@ -15,7 +15,7 @@ ms.custom: seodec18
 
 This article describes how to deploy and manage R and Python scripts as an application inside a SQL Server 2019 big data cluster (preview).
 
-R and Python applications are deployed and managed with the **mssqlctl-pre** command-line utility which is included in CTP 2.1. This article provides examples of how to deploy these R and Python scripts as apps from the command line.
+R and Python applications are deployed and managed with the **mssqlctl-pre** command-line utility, which is included in CTP 2.2. This article provides examples of how to deploy these R and Python scripts as apps from the command line.
 
 ## Prerequisites
@@ -26,12 +26,12 @@ You must have a SQL Server 2019 big data cluster configured. For more informatio
 The **mssqlctl-pre** command-line utility is provided to preview the Python and R application deployment feature. Use the following command to install the utility:
-In CTP 2.1 you can create, delete, list, and run an R or Python application. The following table describes the application deployment commands that you can use with **mssqlctl-pre**.
+In CTP 2.2 you can create, delete, list, and run an R or Python application. The following table describes the application deployment commands that you can use with **mssqlctl-pre**.
 
 | Command | Description |
 |---|---|
@@ -51,15 +51,16 @@ The following sections describe these commands in more detail.
 
 ## Log in
 
-Before configuring R and Python applications, first log into your SQL Server big data cluster with the `mssqlctl-pre login` command. Specify the IP address (external) of the `service-proxy-lb` (for example: `https://ip-address:30777`) along with the user name and password to the cluster.
-
-You can get the IP address of the service-proxy-lb service by running this command in a bash or cmd window:
+Before configuring R and Python applications, first log in to your SQL Server big data cluster with the `mssqlctl-pre login` command. Specify the external IP address of the `service-proxy-lb` or `service-proxy-nodeport` service (for example: `https://ip-address:30777`) along with the user name and password for the cluster.
+
+You can get the IP address of the **service-proxy-lb** or **service-proxy-nodeport** service by running this command in a bash or cmd window:
 
 ```bash
 kubectl get svc service-proxy-lb -n <name of your cluster>
docs/big-data-cluster/big-data-cluster-release-notes.md (64 additions & 2 deletions)
@@ -5,7 +5,7 @@ description: This article describes the latest updates and known issues for SQL
 author: rothja
 ms.author: jroth
 manager: craigg
-ms.date: 12/06/2018
+ms.date: 12/07/2018
 ms.topic: conceptual
 ms.prod: sql
 ms.custom: seodec18
@@ -17,12 +17,74 @@ This article provides the latest updates and known issues for the latest release
 
 | Release | Date |
 |---|---|
+|[CTP 2.2](#ctp22)| December 2018 |
 |[CTP 2.1](#ctp21)| November 2018 |
 |[CTP 2.0](#ctp20)| October 2018 |
 
-
 [!INCLUDE [Limited public preview note](../includes/big-data-cluster-preview-note.md)]
 
+## <a id="ctp22"></a> CTP 2.2 (December 2018)
+
+The following sections describe the new features and known issues for big data clusters in SQL Server 2019 CTP 2.2.
+
+### What's in the CTP 2.2 release?
+
+- Use SparkR from Azure Data Studio on a big data cluster.
+- Cluster Admin Portal accessed with `/portal` (**https://\<ip-address\>:30777/portal**).
+- Master pool service name changed from `service-master-pool-lb` and `service-master-pool-nodeport` to `endpoint-master-pool`.
+- New version of **mssqlctl** and updated images.
+- Miscellaneous bug fixes and improvements.
+
+### Known issues
+
+The following sections provide known issues for SQL Server big data clusters in CTP 2.2.
+
+#### Deployment
+
+- Upgrading a big data cluster from a previous release is not supported. You must back up and delete any existing big data cluster before deploying the latest release. For more information, see [Upgrade to a new release](deployment-guidance.md#upgrade).
+
+- After deploying on AKS, you might see the following two warning events from the deployment. Both of these events are known issues, but they do not prevent you from successfully deploying the big data cluster on AKS.
+
+   `Warning FailedMount: Unable to mount volumes for pod "mssql-storage-pool-default-1_sqlarisaksclus(c83eae70-c81b-11e8-930f-f6b6baeb7348)": timeout expired waiting for volumes to attach or mount for pod "sqlarisaksclus"/"mssql-storage-pool-default-1". list of unmounted volumes=[storage-pool-storage hdfs storage-pool-mlservices-storage hadoop-logs]. list of unattached volumes=[storage-pool-storage hdfs storage-pool-mlservices-storage hadoop-logs storage-pool-java-storage secrets default-token-q9mlx]`
+
+   `Warning Unhealthy: Readiness probe failed: cat: /tmp/provisioner.done: No such file or directory`
+
+- If a big data cluster deployment fails, the associated namespace is not removed. This could result in an orphaned namespace on the cluster. A workaround is to delete the namespace manually before deploying a cluster with the same name.
+
+#### External tables
+
+- It is possible to create a data pool external table for a table that has unsupported column types. If you query the external table, you get a message similar to the following:
+
+   `Msg 7320, Level 16, State 110, Line 44 Cannot execute the query "Remote Query" against OLE DB provider "SQLNCLI11" for linked server "(null)". 105079; Columns with large object types are not supported for external generic tables.`
+
+- If you query a storage pool external table, you might get an error if the underlying file is being copied into HDFS at the same time.
+
+   `Msg 7320, Level 16, State 110, Line 157 Cannot execute the query "Remote Query" against OLE DB provider "SQLNCLI11" for linked server "(null)". 110806;A distributed query failed: One or more errors occurred.`
+
+#### Spark and notebooks
+
+- Pod IP addresses may change in the Kubernetes environment as pods restart. If the master pod restarts, the Spark session may fail with `NoRouteToHostException`. This is caused by JVM caches that don't get refreshed with new IP addresses.
+
+- If you have Jupyter already installed with a separate Python on Windows, Spark notebooks might fail. To work around this issue, upgrade Jupyter to the latest version.
+
+- In a notebook, if you click the **Add Text** command, the text cell is added in preview mode rather than edit mode. You can click the preview icon to toggle to edit mode and edit the cell.
+
+#### HDFS
+
+- If you right-click a file in HDFS to preview it, you might see the following error:
+
+   `Error previewing file: File exceeds max size of 30MB`
+
+   Currently there is no way to preview files larger than 30 MB in Azure Data Studio.
+
+- Configuration changes to HDFS that involve changes to hdfs-site.xml are not supported.
+
+#### Security
+
+- The SA_PASSWORD is part of the environment and discoverable (for example, in a core dump file). You must reset the SA_PASSWORD on the master instance after deployment. This is not a bug but a security step. For more information on how to change the SA_PASSWORD in a Linux container, see [Change the SA password](../linux/quickstart-install-connect-docker.md#sapassword).
+
+- AKS logs may contain the SA password for big data cluster deployments.
+
 ## <a id="ctp21"></a> CTP 2.1 (November 2018)
 
 The following sections describe the new features and known issues for big data clusters in SQL Server 2019 CTP 2.1.
docs/big-data-cluster/cluster-admin-portal.md (3 additions & 3 deletions)
@@ -5,7 +5,7 @@ description: Learn how to use the cluster administration portal to monitor SQL S
 author: yualan
 ms.author: alayu
 manager: craigg
-ms.date: 12/06/2018
+ms.date: 12/07/2018
 ms.topic: conceptual
 ms.prod: sql
 ms.custom: seodec18
@@ -26,10 +26,10 @@ The cluster administration portal allows you to:
 
 Follow the [quickstart to deploy your big data cluster](quickstart-big-data-cluster-deploy.md) until you get to the **cluster administration portal** section. Once you have the big data cluster running with mssqlctl, follow these instructions:
 
-Once the controller pod is running, you can use the cluster administration portal to monitor the deployment. You can access the portal using the external IP address and port number for the `service-proxy-lb` (for example: **https://\<ip-address\>:30777**). Credentials for accessing the admin portal are the values of `CONTROLLER_USERNAME` and `CONTROLLER_PASSWORD` environment variables provided above.
+Once the controller pod is running, you can use the cluster administration portal to monitor the deployment. You can access the portal using the external IP address and port number for the `service-proxy-lb` (for example: **https://\<ip-address\>:30777/portal**). Credentials for accessing the admin portal are the values of the `CONTROLLER_USERNAME` and `CONTROLLER_PASSWORD` environment variables provided above.
 
 > [!NOTE]
-> For CTP 2.1, There is a security warning when accessing the web page since it is using auto-generated SSL certificates.
+> For CTP 2.2, there is a security warning when accessing the web page because it uses auto-generated SSL certificates.